IT student awarded for proposing safer healthcare model

A proposal for a model to protect intelligent healthcare systems from cyberattacks has earned a Malaysian Indian IT student an award in Australia.

Arawinkumaar Selvakkumar, 23, received the Best Paper Award from the Queensland University of Technology (QUT) School of Computer Science for his research paper titled “Addressing the Challenges and Impact of Adversarial Machine Learning Attacks on Smart Healthcare Systems”.

“Since the outbreak of the COVID-19 pandemic, I have developed a deep interest in the healthcare industry and how it can adapt to meet the overwhelming demand for its services.

“After spending countless hours watching tutorial videos from PyImageSearch and reading more than 30 research papers about artificial intelligence and smart healthcare, I came up with a standard that can assist the health system,” said Arawinkumaar.

He said several vulnerabilities and adversarial attacks make it challenging to keep an intelligent healthcare system safe and secure.

“Machine learning has been used widely to develop suitable models to predict and mitigate attacks. 

“Still, such attacks can trick the machine learning models into misclassifying diseases and symptoms.

“As a result, they lead to incorrect decisions, for example, false disease detection and wrong treatment plans for patients,” said Arawinkumaar, who completed his Master of Information Technology, majoring in Cyber Security and Networks, at QUT in Brisbane in December.

His paper outlined the types of adversarial attacks and their impact on smart healthcare systems, and proposed a model to examine how such attacks affect machine learning classifiers.

Arawinkumaar used a medical image dataset to build and test a model that classifies medical images with high accuracy.

“We then attacked the model with a Fast Gradient Sign Method (FGSM) attack to cause it to misclassify the images.

“Using transfer learning, we trained a VGG-19 model on the medical dataset and later applied the FGSM attack to the convolutional neural network (CNN) to examine the significant impact it has on the performance and accuracy of the machine learning model.

“Our results demonstrate that the adversarial attack causes the model to misclassify the images, dropping its accuracy from 88% to 11%,” said Arawinkumaar, a former international student ambassador for the City of Brisbane.
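In outline, the approach he describes resembles the following minimal sketch in TensorFlow/Keras: a VGG-19 classifier adapted via transfer learning and then perturbed with FGSM. The dataset, number of classes, classification head and attack strength are illustrative assumptions; the article does not give those details.

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG19

# Assumptions for illustration only: two classes and 224x224 RGB inputs.
NUM_CLASSES = 2
IMG_SHAPE = (224, 224, 3)

# Transfer learning: reuse VGG-19 features pretrained on ImageNet,
# freeze them, and train only a small classification head.
base = VGG19(weights="imagenet", include_top=False, input_shape=IMG_SHAPE)
base.trainable = False

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=...)  # medical dataset not shown

loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()

def fgsm_perturb(images, labels, epsilon=0.01):
    """Fast Gradient Sign Method: take one step in the direction of the
    sign of the loss gradient with respect to the input images."""
    images = tf.convert_to_tensor(images, dtype=tf.float32)
    with tf.GradientTape() as tape:
        tape.watch(images)
        predictions = model(images)
        loss = loss_fn(labels, predictions)
    gradients = tape.gradient(loss, images)
    adversarial = images + epsilon * tf.sign(gradients)
    return tf.clip_by_value(adversarial, 0.0, 1.0)

# Evaluating the model on fgsm_perturb(test_images, test_labels) rather than
# the clean test set is what exposes the kind of accuracy collapse reported
# in the paper.
```

The perturbation is small enough to leave the images visually unchanged, which is what makes the resulting misclassifications a security concern for clinical decision-making.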

He said he was invited to share his thoughts on his award-winning paper at the 14th International Conference on Sensing Technology (ICST 2022) in Chennai from 17 to 19 January.

Due to travel restrictions, he gave a recorded presentation on his discoveries at the conference.

His conference paper, titled “Addressing adversarial machine learning attacks in smart health care perspectives”, highlights defensive approaches and how hospitals can use intelligent healthcare systems to meet demand and solve health system challenges.

Arawinkumaar, who is waiting to enter the IT workforce, is also working with his university to have the paper published in a journal.
