
Edu News

Unpacking black-box models

Researchers create a mathematical framework to evaluate explanations of machine-learning models and quantify how well people understand them

EP Staff


Written by Adam Zewe, MIT News Office

Modern machine-learning models, such as neural networks, are often referred to as “black boxes” because they are so complex that even the researchers who design them can’t fully understand how they make predictions.

To provide some insights, researchers use explanation methods that seek to describe individual model decisions. For example, they may highlight words in a movie review that influenced the model’s decision that the review was positive.

But these explanation methods do no good if humans can't easily understand them, or worse, misunderstand them. So MIT researchers created a mathematical framework to formally quantify and evaluate the understandability of explanations for machine-learning models. This can help pinpoint insights about model behavior that would be missed if a researcher evaluated only a handful of individual explanations to try to understand the entire model.

“With this framework, we can have a very clear picture of not only what we know about the model from these local explanations, but more importantly what we don’t know about it,” says Yilun Zhou, an electrical engineering and computer science graduate student in the Computer Science and Artificial Intelligence Laboratory (CSAIL) and lead author of a paper presenting this framework.

Zhou’s co-authors include Marco Tulio Ribeiro, a senior researcher at Microsoft Research, and senior author Julie Shah, a professor of aeronautics and astronautics and the director of the Interactive Robotics Group in CSAIL. The research will be presented at the Conference of the North American Chapter of the Association for Computational Linguistics.

Understanding local explanations

One way to understand a machine-learning model is to find another model that mimics its predictions but uses transparent reasoning patterns. However, recent neural network models are so complex that this technique usually fails. Instead, researchers resort to using local explanations that focus on individual inputs. Often, these explanations highlight words in the text to signify their importance to one prediction made by the model.
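To make word-level local explanations concrete, here is a minimal, hypothetical sketch (not the authors' code): it estimates each word's importance by leave-one-out deletion, using a toy lexicon-based "model" as a stand-in for a real classifier.

```python
# Illustrative sketch (not the authors' code): estimate each word's
# influence on a prediction by leave-one-out deletion. The lexicon-based
# "model" below is a hypothetical stand-in for a real classifier.

POSITIVE = {"memorable", "flawless", "charming"}
NEGATIVE = {"boring", "dull"}

def sentiment_score(words):
    """Toy model: net fraction of positive vs. negative words."""
    if not words:
        return 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / len(words)

def saliency(words):
    """Leave-one-out importance: score drop when each word is removed."""
    base = sentiment_score(words)
    return {w: base - sentiment_score(words[:i] + words[i + 1:])
            for i, w in enumerate(words)}

review = "a charming and memorable film".split()
scores = saliency(review)
# Sentiment-bearing words ("charming", "memorable") get positive saliency;
# filler words ("a", "and", "film") do not.
```

Real attribution methods (gradients, perturbation-based explainers) are far more sophisticated, but they produce the same kind of per-word importance scores that the rest of the article discusses.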

Implicitly, people then generalize these local explanations to overall model behavior. Someone may see that a local explanation method highlighted positive words (like “memorable,” “flawless,” or “charming”) as being the most influential when the model decided a movie review had a positive sentiment. They are then likely to assume that all positive words make positive contributions to a model’s predictions, but that might not always be the case, Zhou says.

The researchers developed a framework, known as ExSum (short for explanation summary), that formalizes those types of claims into rules that can be tested using quantifiable metrics. ExSum evaluates a rule on an entire dataset, rather than just the single instance for which it is constructed.

Researchers use local explanation methods to try to understand how machine-learning models make decisions. Even if these explanations are correct, they do no good if humans can't understand what they mean. MIT researchers have now developed a mathematical framework to quantify and evaluate the understandability of an explanation.
Credits: Image courtesy of the researchers

Using a graphical user interface, an individual writes rules that can then be tweaked, tuned, and evaluated. For example, when studying a model that learns to classify movie reviews as positive or negative, one might write a rule that says “negation words have negative saliency,” which means that words like “not,” “no,” and “nothing” contribute negatively to the sentiment of movie reviews.

Using ExSum, the user can see whether that rule holds up using three specific metrics: coverage, validity, and sharpness. Coverage measures how broadly applicable the rule is across the entire dataset. Validity is the percentage of individual examples that agree with the rule. Sharpness describes how precise the rule is; a highly valid rule could be so generic that it isn't useful for understanding the model.
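A rule check of this kind can be sketched in a few lines. The names and data below are illustrative, not the actual ExSum API: a rule pairs an applicability predicate with a behavior predicate, and is scored over (word, saliency) instances pooled from many local explanations.

```python
# Minimal sketch of an ExSum-style rule check (all names are illustrative,
# not the actual ExSum API). A rule pairs an applicability predicate with
# a behavior predicate, scored over pooled (word, saliency) instances.

NEGATION = {"not", "no", "nothing", "never"}

def evaluate_rule(applies, behaves, instances):
    """Return (coverage, validity) for a rule over saliency instances.

    coverage: fraction of all instances the rule applies to.
    validity: of the covered instances, fraction matching the behavior.
    (Sharpness would additionally ask how tightly `behaves` constrains
    the saliency values, e.g. a narrow range rather than just "negative".)
    """
    covered = [x for x in instances if applies(x)]
    if not covered:
        return 0.0, 0.0
    valid = sum(behaves(x) for x in covered)
    return len(covered) / len(instances), valid / len(covered)

# Hypothetical pooled data.
data = [("not", -0.8), ("no", -0.5), ("charming", 0.7),
        ("never", -0.3), ("film", 0.1), ("nothing", 0.2)]

# Rule: "negation words have negative saliency."
coverage, validity = evaluate_rule(
    applies=lambda x: x[0] in NEGATION,
    behaves=lambda x: x[1] < 0,
    instances=data,
)
# coverage = 4/6; validity = 3/4 ("nothing" violates the rule here).
```

The point of evaluating over the whole dataset, rather than one example, is exactly what separates a tested rule from an impression formed from a single local explanation.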

Testing assumptions

If a researcher seeks a deeper understanding of how her model is behaving, she can use ExSum to test specific assumptions, Zhou says.

If she suspects her model discriminates based on gender, she could create rules stating that male pronouns make a positive contribution to predictions and female pronouns a negative contribution. If these rules have high validity, it means they hold across the dataset, and the model is likely biased.
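The gender-bias test described above could be sketched as two such rules over pooled (word, saliency) pairs; the data and helper below are entirely hypothetical.

```python
# Hedged sketch: the gender-bias hypothesis expressed as two rules over
# pooled (word, saliency) pairs. Data and helper names are hypothetical.

MALE = {"he", "him", "his"}
FEMALE = {"she", "her", "hers"}

def rule_validity(instances, vocab, positive):
    """Fraction of covered instances whose saliency sign matches the rule."""
    covered = [(w, s) for w, s in instances if w in vocab]
    if not covered:
        return 0.0
    return sum((s > 0) == positive for _, s in covered) / len(covered)

data = [("he", 0.4), ("his", 0.2), ("she", -0.3),
        ("her", -0.1), ("him", 0.5), ("hers", 0.1)]

male_valid = rule_validity(data, MALE, positive=True)       # 3 of 3 match
female_valid = rule_validity(data, FEMALE, positive=False)  # 2 of 3 match
# High validity on both rules would flag the model as likely biased.
```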

ExSum can also reveal unexpected information about a model’s behavior. For example, when evaluating the movie review classifier, the researchers were surprised to find that negative words tend to have more pointed and sharper contributions to the model’s decisions than positive words. This could be due to review writers trying to be polite and less blunt when criticizing a film, Zhou explains.

“To really confirm your understanding, you need to evaluate these claims much more rigorously on a lot of instances. This kind of understanding at this fine-grained level, to the best of our knowledge, has never been uncovered in previous works,” he says.

“Going from local explanations to global understanding was a big gap in the literature. ExSum is a good first step at filling that gap,” adds Ribeiro.

Extending the framework

In the future, Zhou hopes to build on this work by extending the notion of understandability to other criteria and explanation forms, such as counterfactual explanations (which indicate how to modify an input to change the model's prediction). For now, the framework focuses on feature attribution methods, which describe the individual features a model used to make a decision (like the words in a movie review).

In addition, he wants to further enhance the framework and user interface so people can create rules faster. Writing rules can require hours of human involvement — and some level of human involvement is crucial because humans must ultimately be able to grasp the explanations — but AI assistance could streamline the process.

As he ponders the future of ExSum, Zhou hopes their work highlights a need to shift the way researchers think about machine-learning model explanations.

“Before this work, if you have a correct local explanation, you are done. You have achieved the holy grail of explaining your model. We are proposing this additional dimension of making sure these explanations are understandable. Understandability needs to be another metric for evaluating our explanations,” says Zhou.

This research is supported, in part, by the National Science Foundation.


NMIMS SBM Offers MBA (Part-Time) Program for Working Executives

The program focuses on providing a holistic education to enhance their employability

EP Staff


Start your professional learning journey with the best-in-class MBA (Part-Time) program from NMIMS School of Business Management, featured among the top 100 global B-schools in the Financial Times MiM 2022 ranking. The highly respected Indian business school, with a 41-year legacy and a distinguished faculty, has announced that admissions are open for its MBA (Part-Time) program at its Mumbai campus.

This program offers working executives an opportunity to acquire a high-end, compact management qualification through rigorous in-class learning and practical exposure to industry expertise. It focuses on providing a holistic education that enhances employability, exposing working executives to contemporary trends and practices in management, and it couples excellent academic resources with industry best practices to boost the managerial competence of executives.

The NMIMS SBM MBA (Part-Time) is accredited by the Association to Advance Collegiate Schools of Business (AACSB) and is designed for working executives who want to enhance their skills and advance their careers. NMIMS employs world-class pedagogy to provide students with a comprehensive and engaging learning experience, delivered by a team of experts with extensive experience in the business world. The program structure is up to 40% hybrid and 60% in the classroom, and the Bloomberg Certification Program helps students use financial analysis tools more efficiently.

“The program is specifically meant for executives who have spent quality time in the industry and have adequate exposure to managerial roles and responsibilities. The two-year MBA (Part-Time) program will offer participants an opportunity to hone their managerial skills and enable them to contribute better to decision-making. It has been designed to empower students with a well-planned schedule that allows for a balance between study and work,” said Dr. Prashant Mishra, Dean, School of Business Management.

Dr. Pradeep Pai, Program Chairperson, MBA (Part-Time), School of Business Management, said, “NMIMS SBM is proud to offer this innovative program to working professionals who wish to take the next step in their careers. We believe that this program will be a preferred executive education program for working professionals seeking to upgrade their qualifications by acquiring a widely acclaimed MBA degree.”

The MBA (Part-Time) program goes beyond traditional education by providing learners value-added workshops and industry connections. Our curriculum is regularly updated to ensure students receive the most current knowledge. The program helps them gain a practical and theoretical understanding of the industry and creates a network of professionals for them as they progress.

Eligibility Criteria:

  • 50% in Graduation from a recognized University in any discipline (Distance/Part-time/Full-time).
  • Minimum 3 years of full-time work experience in an executive or supervisory capacity, or self-employment after graduation, up to the date of the written test/personal interview.
  • The work experience should NOT include internships, projects, training periods, trainee (management, engineering) roles, etc.

Selection Process:

  • Written Test for MBA (Part-Time) conducted by NMIMS; OR
  • GMAT score of 600 and above (GMAT scores of the last 5 years, up to the closure of registrations, will be considered); OR
  • Score of 200 and above in the NMAT by GMAC examination for 2020 admission;
AND a Personal Interview.


eSecForte visits the Northeast to strengthen the Industry-Academia Interface for effective cybersecurity

The team also visited the Indian Institute of Information Technology (IIIT), Manipur and interacted with students on the topic of Digital Forensics Challenges and career options

EP Staff


In an effort to create a holistic cybersecurity ecosystem in the country, a team from eSecForte led by Lt Col (Dr.) Santosh Khadsare (Retd), VP Digital Forensics & Incident Response, visited the National Institute of Electronics and Information Technology (NIELIT), Kohima, and held discussions with the institute's director, L. Lanuwabang. The deliberations focused on enlarging the gamut of cybersecurity in the country, especially digital forensics, and on how eSecForte and NIELIT can work together to contribute effectively in this regard.

The team also visited the Indian Institute of Information Technology (IIIT), Manipur and interacted with students on the topic of digital forensics challenges and career options. The discussion covered how digital forensics can prove instrumental in bringing the perpetrators of cybercrime to book. The various challenges related to implementing digital forensics tools and methodologies were also part of the constructive dialogue between the team and the enthusiastic students.

The team lead also had a fruitful discussion with Dr. Krishnan Bhaskar, Director, IIIT, Imphal, on how the industry-academia interface can do wonders in the fields of cybersecurity and digital forensics. Specifically, detailed deliberations were held on devising exchange mechanisms and collaborative opportunities so that industry and academia can come together to create a self-sustaining cybersecurity ecosystem. Such development will ensure that cybercrime cases are dealt with swiftly and lead to the desired outcomes in terms of justice and well-being for all participating stakeholders.

In the next leg of its outreach journey, eSecForte's team visited the National Forensic Science University (NFSU), Imphal campus, and exchanged ideas with the campus coordinator on the latest trends and developments in digital forensics. Lt Col (Dr.) Santosh Khadsare (Retd), VP DFIR, also interacted with more than 400 trainees of the North Eastern Police Academy (NEPA), Shillong, thoroughly apprising these aspiring investigating officers of the importance of digital forensics and its utility in cracking cases of cyber fraud, cyberbullying, and other crimes associated with cyberspace. Case studies were discussed on handling digital evidence at crime scenes with utmost care so that trace evidence is not lost in logistical procedures and processes. The team also showcased eSecForte's flagship products, Digital Forensic workstations and Faraday bags, built under India's ambitious MAKE IN INDIA (MII) initiative.


Roots Collegium signs World Chess Champion Koneru Humpy as Brand Ambassador

Chairman Sri. B.P. Padala announces Koneru Humpy as the Brand Ambassador of Roots Collegium educational institutions

EP Staff


  • Humpy will be the face of its upcoming brand-related undertakings

Celebrating its 30-year legacy of imparting holistic education, Roots Collegium, a well-known educational institution in Hyderabad, today announced World Chess Champion Koneru Humpy as its Brand Ambassador. The appointment of Ms. Koneru Humpy is set to boost the brand image of Roots Collegium, amplifying the institution's philosophy of giving its students global exposure.

With the signing of world chess champion Koneru Humpy as its Brand Ambassador, Roots Collegium will explore untapped areas of new-age innovation, adding to its achievements of the last 30 years. The fast-expanding Roots Collegium, started in 1991, offers intermediate courses and almost all streams of Bachelor's degrees, including BBA, BBA (Business Analytics), B.Com (General, Computers, Sales), BA (Mass Communication, Psychology, and Modern Languages), and B.Sc. (Data Analysis). The college also offers a variety of courses in design, film and media, visual arts, hotel management, and culinary arts, along with many other certificate courses.

Reacting to her appointment, Koneru Humpy said, “It is my pleasure to be the brand ambassador of Roots Collegium. I thank Sri. BP Padala gaaru for the honour. Roots Collegium as an institution has been offering the best educational facilities to its students for the last 30 years. Roots and I share a similar journey, as we both started our journeys 30 years ago, and we share the same passion, integrity, and ethics. That is the special part about Roots Collegium, to which I instantly related. And now we are ready to travel together. As an ambassador, I look forward to contributing my committed services to the college and helping it grow in stature. I will take part in every event of the college and share my thoughts and way-forward ideas across all its initiatives. Once again, I want to thank the staff and management for choosing me as brand ambassador.”

Koneru Humpy is the 2019 FIDE Women's rapid chess champion. She became the youngest woman to achieve the title of Grandmaster in 2002, at the age of 15. Her association will be another feather in the cap of Roots Collegium's growing achievements and a historic moment in its journey.

The Chairman of Roots Collegium, Sri. B P Padala, said, “It is our honour to have such a young and dynamic chess champion, Koneru Humpy, as our brand ambassador. I hope the students of Roots Collegium will be motivated by such an inspiring champion, who has overcome a lot to be in the position she is today. I have no doubt that her determination and strong belief made her a world champion. I hope every student will learn from her the habit of not giving up and fighting for what they want. She will be a role model for each and every student of our institution. I thank Koneru Humpy for accepting this position; people like her will definitely help bring change in society.”

The students of Roots Collegium, from different parts of Hyderabad, are excited and look forward to listening to life-changing stories from Koneru Humpy in the future.
