The Machine Intelligence Development (MIND) Lab at the American University of Beirut (AUB) aims to provide state-of-the-art solutions in the field of Artificial Intelligence, with a focus on machine learning theory and applications. The lab has recently established a new identity, grown in size, and made great progress in cutting-edge research in two main domains: Natural Language Processing and Context-Aware Sensing.
As we begin a new year, we are extremely proud of the diligence and perseverance that our lab has shown despite the challenges that the world and Lebanon faced over the past year. Our students were able to advance cutting-edge research, produce publications, compete in local and global challenges, and share their expertise with the world through open-source solutions and invited talks.
2020 Research Manifesto
Natural Language Processing (NLP)
Our NLP-focused team has explored a wide range of challenges, from conversational systems and Arabic as a low-resource language to studying human reading comprehension.
Gilbert Badaro, our lab's newest Ph.D. graduate, has investigated multiple link prediction approaches and different natural language processing variations to link a large Arabic lexical resource, SAMA, to the English WordNet. In his work, he used a machine learning approach and built training data from the existing small-scale Arabic WordNet. His latest article is important because it publicly provides a large-scale sentiment lexical resource for Arabic, ArSenL 2.0, enabling more accurate sentiment mining models. It also presents a benchmark dataset for other researchers in the field interested in automatically evaluating approaches for Arabic WordNet expansion.
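To give a concrete flavor of this framing, here is a minimal, purely illustrative sketch of link prediction as binary classification: each candidate (SAMA lemma, English synset) pair becomes a feature vector, and a classifier trained on links from the existing Arabic WordNet decides whether the mapping should hold. The features and data below are placeholders, not those used in Gilbert's work.

```python
# Purely illustrative: random placeholder features stand in for real pair
# features (e.g., gloss-translation similarity or embedding distances).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
pair_features = rng.normal(size=(200, 16))   # one row per candidate (lemma, synset) pair
is_link = rng.integers(0, 2, size=200)       # supervision from the small-scale Arabic WordNet

clf = LogisticRegression(max_iter=1000).fit(pair_features, is_link)
print(clf.predict_proba(pair_features[:3])[:, 1])  # predicted link probabilities
```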
Obeida ElJundi has focused on exploring human reading comprehension. Reading comprehension is not the result of a single process, as was thought before the 1970s; rather, psychologists realized that a combination of several complex cognitive processes is involved. Traditional NLP algorithms used nowadays (e.g., RNNs) cover some aspects of these cognitive processes, such as attention and forgetting, but leave other aspects unaccounted for. Obeida has been studying how the human brain reads and comprehends written text, with the aim of revealing what is missing and improving the overall performance of NLP models.
Wissam Antoun has recently defended his thesis on Arabic Transformers for NLU and chatbots. This year, he trained the first Arabic BERT model, AraBERT, alongside his colleague Fady Baly. Wissam is now planning to explore newer and better models for language understanding and generation, and he is off to a great start! Wissam and Fady have started 2021 by releasing 10 new Arabic Transformer models (AraGPT2, AraELECTRA, and AraBERTv2) on the AUB MIND Lab GitHub repository and Hugging Face.
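For readers who want to try the models, the sketch below shows one way to load AraBERT through the Hugging Face `transformers` library. The checkpoint name is the one published on the lab's Hugging Face page, and the snippet is an illustrative example rather than an official usage guide.

```python
# Minimal sketch: load AraBERT from Hugging Face and extract contextual embeddings.
# Assumes the published checkpoint name "aubmindlab/bert-base-arabertv2";
# the AraGPT2 and AraELECTRA checkpoints can be loaded the same way.
from transformers import AutoModel, AutoTokenizer

model_name = "aubmindlab/bert-base-arabertv2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

inputs = tokenizer("مرحبا بكم في مختبر MIND", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, num_tokens, hidden_size)
```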
Nataly Dalal has worked on designing lifelong learning for chatbots in customer support, thus allowing them to continuously learn and accumulate knowledge on their own from support forums posted on the web. Lifelong learning aims at producing chatbots that are more accurate, robust, and human-like.
Christian Hokayem has focused the bulk of his research on conversational automated negotiation agents. With the goal of improving user experience in an automated sales setting, he investigated what empathy is and how to incorporate it into agents' language generation and decision-making models. In his work, he formulated the problem as a multi-objective reinforcement learning problem and trained the agent to maximize both user experience (represented by sentiment) and sales price.
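As a rough illustration of how two such objectives can be folded into a single reinforcement-learning reward, consider the hypothetical scalarization below; the weighting and scaling are illustrative, not the formulation used in Christian's models.

```python
# Hypothetical scalarized reward for a negotiation agent: trade off user
# sentiment (a proxy for experience) against the normalized sale price.
def negotiation_reward(sentiment_score: float, agreed_price: float,
                       asking_price: float, w_sentiment: float = 0.5) -> float:
    """sentiment_score is assumed in [-1, 1]; prices are positive."""
    sentiment_term = (sentiment_score + 1.0) / 2.0       # map sentiment to [0, 1]
    price_term = min(agreed_price / asking_price, 1.0)   # reward closing near the asking price
    return w_sentiment * sentiment_term + (1.0 - w_sentiment) * price_term

# Example: a positive interaction that closed at 80% of the asking price.
print(negotiation_reward(sentiment_score=0.6, agreed_price=80, asking_price=100))
```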
Tarek Naous, our newest master's student, has started with research focused on enabling empathetic behavior in conversational agents for the Arabic language, with the global aim of making human-machine interactions more similar to human-human interactions. In 2020, he was able to release a dataset of empathetic conversations in Arabic and propose a neural-based model for generating empathetic responses. His work was published at WANLP 2020 and is being developed further, where the main challenge is improving performance given the limited size of the available dataset.
Context-Aware Sensing
With a focus on overcoming the challenge of learning from limited labeled data, our work in context-aware sensing has been versatile, ranging from healthcare applications, such as traumatic brain injury analysis and epilepsy seizure prediction, to power load forecasting and human activity recognition.
Reem Mahmoud's research interest is in personalized machine learning, which branches into problems of learning with little labeled data and advancing traditional transfer learning methods. This past year, she has looked into overcoming catastrophic forgetting in pre-trained neural networks for time-series applications, such as human activity recognition. Her work aims to improve performance on target tasks with limited training data while preserving performance on the source tasks of the pre-trained model. This work, titled "Multi-Objective Learning to Overcome Catastrophic Forgetting in Time-series Applications," has recently been submitted to ACM Transactions on Knowledge Discovery from Data.
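One simple way to picture the trade-off (a minimal sketch, not the method in the submitted paper) is to fine-tune on the target task while penalizing how far the weights drift away from the pre-trained source model:

```python
# Minimal sketch (not the submitted method): fine-tune on the target task while
# penalizing drift away from the frozen pre-trained source model, a simple proxy
# for "preserve performance on the source tasks".
# Assumes `model` and `source_model` share the same architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

def fine_tune_step(model: nn.Module, source_model: nn.Module,
                   batch_x: torch.Tensor, batch_y: torch.Tensor,
                   optimizer: torch.optim.Optimizer, lam: float = 0.1) -> float:
    target_loss = F.cross_entropy(model(batch_x), batch_y)

    # L2 distance between current parameters and the pre-trained ones (kept fixed).
    drift = sum(((p - q.detach()) ** 2).sum()
                for p, q in zip(model.parameters(), source_model.parameters()))

    loss = target_loss + lam * drift   # lam trades off new-task fit vs. forgetting
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```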
Marc Djandji has focused his research over the past year on improving the performance of deep learning models when trained with limited labeled examples. He has investigated the problem specifically in the context of short-term load forecasting, where his work looks into scalable multi-task learning methods.
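As an illustration of the general idea, the hypothetical architecture below shares a single encoder across houses while keeping one forecasting head per house; it is a sketch of hard parameter sharing, not Marc's specific method.

```python
# Hypothetical hard parameter sharing for multi-house load forecasting:
# one shared GRU encoder, one 24-hour forecasting head per house (task).
import torch
import torch.nn as nn

class SharedEncoderForecaster(nn.Module):
    def __init__(self, n_tasks: int, input_size: int = 1,
                 hidden_size: int = 64, horizon: int = 24):
        super().__init__()
        self.encoder = nn.GRU(input_size, hidden_size, batch_first=True)       # shared across tasks
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_size, horizon) for _ in range(n_tasks)])          # one head per house

    def forward(self, load_history: torch.Tensor, task_id: int) -> torch.Tensor:
        _, h = self.encoder(load_history)   # h: (num_layers, batch, hidden)
        return self.heads[task_id](h[-1])   # per-house 24-step forecast

model = SharedEncoderForecaster(n_tasks=5)
week = torch.randn(8, 168, 1)               # a week of hourly load for 8 samples
print(model(week, task_id=0).shape)          # torch.Size([8, 24])
```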
This year, Alaa Khaddaj completed an internship at Signal Kinetics (MIT Media Lab) under the supervision of Prof. Fadel Adib, where the team developed an apparatus that can detect contaminated food or counterfeit products. The system used a generative model (a variational autoencoder) to generalize to new, unseen environments and a transfer learning scheme to learn from multiple experiments simultaneously. Alaa's other line of research was to develop a domain adaptation model that can generalize a classifier across two similar but distinct domains. For his final year project, his team developed an intelligent question-answering system for customer support; this work has been submitted as a journal paper.
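On the domain adaptation side, a standard building block in adversarial approaches is a gradient reversal layer, which trains the feature extractor to fool a domain discriminator so that the learned representation transfers across domains. The sketch below illustrates the general technique rather than the specific model Alaa developed.

```python
# Minimal sketch of a gradient reversal layer for adversarial domain adaptation
# (illustrative only, not necessarily the model developed in the lab).
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; flips (and scales) gradients in the backward pass."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

# The feature extractor learns to fool this discriminator, so the representation
# becomes domain-invariant while remaining useful for the main classifier.
feature_extractor = nn.Sequential(nn.Linear(300, 128), nn.ReLU())
domain_discriminator = nn.Linear(128, 2)

x = torch.randn(16, 300)                       # e.g., averaged text embeddings
features = feature_extractor(x)
domain_logits = domain_discriminator(GradReverse.apply(features, 1.0))
```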
Mosbah Aouad has worked on developing deep learning models for short-term load forecasting for residential houses, where the challenge is to deal with the high non-linearities present in the load data. To build a model sensitive to varying load patterns and large peaks in the data, he augmented a sequence-to-sequence network with an attention mechanism to capture these variations. In parallel, his thesis work has focused on biomedical image analysis, with the aim of predicting the survival rate of penetrating traumatic brain injury (pTBI) patients from brain CT scans. He is designing a representation learning approach that captures relevant features reflecting the severity of pTBI directly from CT scan analysis.
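The sketch below shows the kind of additive attention that lets a sequence-to-sequence decoder weight past load readings when forecasting the next step, so sharp peaks in the history are not washed out; it is an assumed illustration, not the thesis implementation.

```python
# Assumed illustration of additive attention over encoder states in a
# sequence-to-sequence load forecaster (not the thesis implementation).
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(2 * hidden_size, hidden_size),
                                   nn.Tanh(),
                                   nn.Linear(hidden_size, 1))

    def forward(self, decoder_state: torch.Tensor, encoder_states: torch.Tensor):
        # decoder_state: (batch, hidden); encoder_states: (batch, time, hidden)
        query = decoder_state.unsqueeze(1).expand_as(encoder_states)
        scores = self.score(torch.cat([query, encoder_states], dim=-1)).squeeze(-1)
        weights = torch.softmax(scores, dim=1)                        # attention over time steps
        context = torch.bmm(weights.unsqueeze(1), encoder_states).squeeze(1)
        return context, weights                                       # weighted summary of the history

attention = AdditiveAttention(hidden_size=64)
context, weights = attention(torch.randn(8, 64), torch.randn(8, 168, 64))
print(context.shape, weights.shape)  # torch.Size([8, 64]) torch.Size([8, 168])
```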
Open-source Solutions, Competitions & Other Activities
Beyond our research efforts, we are proud of the work our team has put into sharing their expertise by participating in competitions, open-sourcing solutions, and delivering talks.
Here are some of our highlights from 2020:
- We launched the AUB MIND Lab GitHub repository and a brand new website for the group.
- Amir Hussein, Rim Achour, and Fady Baly ranked 1st place in the Fake News Detection task at the QICC Cybersecurity contest.
- Alaa Khaddaj won Best Paper Award at IEEE ICIoT 2020.
- Obeida ElJundi and his work on hULMonA were featured in WIRED Middle East.
- AraBERT, by our alumnus Wissam Antoun and Fady Baly, was featured in MIT Technology Review Arabia.
- Marc Djandji, Fady Baly, and Wissam Antoun ranked 2nd place in the hate and offensive speech detection tasks at the 4th Workshop on Open-Source Arabic Corpora and Processing Tools (OSACT 4), Marseille, France, 2020.
- Wissam Antoun was invited by the IWAN Research Group at King Saud University to deliver a talk titled "AraBERT: Pretraining, fine-tuning, applications and demo (In Arabic)," September 2020.
- Wissam Antoun was invited by the PyData Khobar Meetup to deliver a talk titled "AraBERT: Pretraining, use cases and future improvements," May 2020.
2020 Graduates
- Dalia Jaber – M.E. in Electrical & Computer Engineering
- Wissam Antoun – M.E. in Electrical & Computer Engineering
- Amir Hussein – M.E. in Electrical & Computer Engineering
- Alaa Khaddaj – B.E. in Electrical & Computer Engineering
- Gilbert Badaro – Ph.D. in Electrical & Computer Engineering
- Raslan Kain – M.E. in Electrical & Computer Engineering
2020 List of Publications
Journal Papers
- R. Mahmoud, H. Hajj, F. Karameh, "A Systematic Approach to Multi-Task Learning from Time-series Data," Applied Soft Computing (Elsevier), vol. 96 (2020), 106586.
- G. Badaro, H. Hajj, N. Habash, “A Link Prediction Approach for Accurately Mapping a Large-Scale Arabic Lexical Resource to English WordNet,” ACM Transactions on Asian and Low-Resource Language Information Processing (TALLIP), 19, no. 6 (2020): 1-38.
- N. Abbas, S. Sharafeddine, H. Hajj, and Z. Dawy, “Price-Aware Traffic Splitting in D2D HetNets with Cost-Energy-QoE Tradeoffs,” Computer Networks, vol. 172, p. 107169, 2020.
- A. Hussein, M. Djandji, R. Mahmoud, M. Dhaybi, H. Hajj, "Augmenting Deep Learning with Adversarial Training for Robust Prediction of Epilepsy Seizures," ACM Transactions on Computing for Healthcare, June 2020, Article 18.
- S. Taleb, A. Yeretzian, R. A. Jabr, and H. Hajj. “Optimization of building form to reduce incident solar radiation.” Journal of Building Engineering 28 (2020): 101025.
Conference Papers
- W. Antoun, F. Baly, H. Hajj, “AraBERT: Transformer-based Model for Arabic Language Understanding.” In Proceedings of the 4th Workshop on Open-Source Arabic Corpora and Processing Tools, with a Shared Task on Offensive Language Detection, pp. 9-15. 2020.
- T. Naous, C. Hokayem, H. Hajj, "Empathy-driven Arabic Conversational Chatbot," Proceedings of the Fifth Arabic Natural Language Processing Workshop, December 2020, Barcelona, Spain (Online), pp. 58-68.
- S. Taleb, A. Yeretzian, R. Jabr, H. Hajj, "Energy Optimization of Climate Adapted Buildings in an Urban Context: The Case of Subtropical Climate," 35th PLEA Conference: Planning Post Carbon Cities, 1-3 September 2020, A Coruña, Spain.
- M. Djandji, F. Baly, W. Antoun, H. Hajj. “Multi-Task Learning using AraBert for Offensive Language Detection.” In Proceedings of the 4th Workshop on Open-Source Arabic Corpora and Processing Tools, with a Shared Task on Offensive Language Detection, pp. 97-101. 2020.
- O. ElJundi, M. Dhaybi, K. Mokadam, H. Hajj, and D. Asmar, “Resources and End-to-End Neural Network Models for Arabic Image Captioning,” In Proceedings of the 15th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications – Volume 5: VISAPP, 233-241, 2020, Valletta, Malta.
- A. Khaddaj, H. Hajj, “Representation Learning for Adversarial Domain Adaptation in Text Classification,” In Proceedings of the IEEE International Conference on Informatics, IoT, and Enabling Technologies (ICIoT’20), Qatar, Feb 2-5, 2020. [Winner of best paper award]
- A. Maarouf, N. El Droubi, R. Morcel, H. Hajj, M. Saghir, and H. Akkary. “Optimized Distribution of an Accelerated Convolutional Neural Network across Multiple FPGAs.” In 2020 IEEE 28th Annual International Symposium on Field-Programmable Custom Computing Machines (FCCM), pp. 235-235. IEEE, 2020.
- W. Antoun, F. Baly, R. Achour, A. Hussein, H. Hajj, “State of the Art Models for Fake News Detection Tasks,” In Proceedings of the IEEE International Conference on Informatics, IoT, and Enabling Technologies (ICIoT’20), Qatar, Feb 2-5, 2020.
To discover our lab and potential opportunities, visit our site or stay up to date with our news on LinkedIn and Twitter.