Research Journal of Computer Systems and Engineering https://technicaljournals.org/RJCSE/index.php/journal <h2>Research Journal of Computer Systems and Engineering (RJCSE)</h2> <div id="content" style="font-family: Arial; text-align: justify;"><img style="float: left; max-width: 25%; padding: 0 10px 0 0;" src="https://technicaljournals.org/RJCSE/public/site/images/admin_agser/book-cover.jpg" alt="" width="100%" height="auto" /> <div><strong>e-ISSN: 2230-8571 p-ISSN: 2230-8563 </strong><strong> | Frequency: Bi-Annual </strong>(2 Issues Per Year)<strong> | Nature: </strong>Online<strong> | Language of Publication: </strong>English<strong> | Article Processing Charges: </strong>None (Free of cost)<strong> | Publisher: Vishwakarma Institute of Information Technology</strong></div> <div> </div> <div id="journalDescription"> <p>This journal is devoted to theoretical developments in computer systems science and their applications in computer systems engineering. The journal covers the intense research activity being carried out in the systems field on both theoretical and practical hardware and software problems. Specifically, the journal solicits high-quality, original technical papers addressing research challenges in large-scale database systems, supercomputing, artificial intelligence, software engineering, multimedia and visualization, computer networks, computer and network security, programming languages, testing and verification of classical and non-classical computer systems, amongst others. Original research papers as well as state-of-the-art reviews and technical notes are published regularly. Research notes, new development experience, and application papers are an important part of the journal's all-round coverage of the subject; industrial developments and new products are also monitored.
A conference calendar, reviews of new books and reports of important meetings from around the world keep readers fully informed.</p> <p><strong>Journal Scope</strong></p> <p><strong>1. Computer Architecture and Systems:</strong> Including but not limited to processor design, memory systems, parallel and distributed computing, embedded systems, and hardware-software co-design.</p> <p><strong>2. Software Engineering and Programming Languages:</strong> Covering topics such as software design methodologies, software quality assurance, programming language theory and implementation, software maintenance and evolution, and formal methods.</p> <p><strong>3. Networks and Communications:</strong> Encompassing research on network protocols, wireless and mobile communication systems, network security and privacy, the Internet of Things (IoT), and network performance analysis.</p> <p><strong>4. Data Science and Big Data:</strong> Focusing on data mining, machine learning, artificial intelligence, big data analytics, data visualization, and applications of data science in various domains.</p> <p><strong>5. Cybersecurity and Privacy:</strong> Addressing issues related to cybersecurity threats, intrusion detection and prevention, cryptographic techniques, privacy-enhancing technologies, and secure systems design.</p> <p><strong>6. Human-Computer Interaction:</strong> Including research on user interface design, usability evaluation, interaction techniques, augmented and virtual reality, and user experience (UX) design.</p> <p><strong>7. Robotics and Autonomous Systems:</strong> Covering topics such as robot kinematics and dynamics, control algorithms, robot perception, human-robot interaction, swarm robotics, and autonomous vehicle technologies.</p> <p><strong>8. Computer Vision and Pattern Recognition:</strong> Exploring algorithms and techniques for image and video processing, object detection and recognition, pattern analysis, and computer vision applications.</p> <p><strong>9.
Cloud Computing and Internet Technologies:</strong> Addressing research on cloud infrastructure, resource management, virtualization, edge computing, fog computing, and emerging internet technologies.</p> <p><strong>10. Emerging Technologies and Innovations:</strong> Providing a platform for research on emerging trends and innovations in computer systems and engineering, including quantum computing, neuromorphic computing, bioinformatics, and nanotechnology applications.</p> <p>RJCSE welcomes contributions from researchers, academics, engineers, and practitioners worldwide, aiming to foster collaboration and exchange of ideas to address the challenges and opportunities in the rapidly evolving field of computer systems and engineering. The journal encourages interdisciplinary research that integrates concepts, methodologies, and technologies from various domains to address complex real-world problems and drive technological innovation.</p> </div> </div> Auricle Global Society of Education and Research en-US Research Journal of Computer Systems and Engineering 2230-8563 Implementation of Captcha Mechanisms using Deep Learning to Prevent Automated Bot Attacks https://technicaljournals.org/RJCSE/index.php/journal/article/view/70 <p>The integrity and security of online platforms are seriously threatened by the growth of automated bot attacks, necessitating effective methods for telling harmful bots apart from legitimate users. To combat automated bot attacks, this project investigates the creation and application of CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) mechanisms utilising deep learning techniques. In this study, we employ deep learning to create and apply complex CAPTCHA problems that help us distinguish between real users and automated bots.
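As a toy illustration of how such a challenge might be produced, the sketch below generates a random CAPTCHA string together with per-character distortion parameters (skew and noise level) that a renderer could apply; the character set, length, and skew range are illustrative assumptions, not the paper's parameters:

```python
import random
import string

def generate_captcha_challenge(length=5, max_skew_deg=30, seed=None):
    """Generate a random CAPTCHA string plus per-character distortion
    parameters (skew angle, noise level) that a renderer could apply."""
    rng = random.Random(seed)
    chars = string.ascii_uppercase + string.digits
    text = "".join(rng.choice(chars) for _ in range(length))
    distortions = [
        {
            "char": c,
            "skew_deg": rng.uniform(-max_skew_deg, max_skew_deg),  # character skewing
            "noise": rng.uniform(0.0, 0.3),  # background noise intensity
        }
        for c in text
    ]
    return text, distortions

text, distortions = generate_captcha_challenge(seed=42)
print(text, len(distortions))
```

In a real system the distortion parameters would drive an image renderer, and the challenge text would never leave the server.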
Convolutional neural networks and recurrent neural networks are employed to build dynamic, adjustable CAPTCHAs that keep up with bots' evolving approaches. Our research focuses on creating CAPTCHAs that are simple for humans to use but difficult for bots to solve, creating a favourable experience for genuine human users. Character skewing, background noise injection, and image obfuscation are some of the methods we use to safeguard our CAPTCHAs while maintaining their usability. Furthermore, we carry out exhaustive real-world trials to assess the effectiveness of our deep-learning-based CAPTCHA methods. We evaluate their resistance to a variety of attack methods, such as adversarial attacks and machine learning-based bot attacks, to confirm that they are resilient. The findings of our study show that utilising deep learning in CAPTCHA mechanisms to thwart automated bot attacks is both feasible and effective. Our method makes the internet environment safer and more user-friendly while also enhancing security and reducing user annoyance. This work is an important step in the fight against the growing menace of automated bot attacks in the digital sphere.</p> <p>&nbsp;</p> Sachin R. Sakhare Vivek D. Patil Copyright (c) 2024 2023-12-31 2023-12-31 4 2 01 15 10.52710/rjcse.70 Edge-Based Real-Time Sensor Data Processing for Anomaly Detection in Industrial IoT Applications https://technicaljournals.org/RJCSE/index.php/journal/article/view/71 <p>The Industrial Internet of Things (IIoT), which uses sensor-equipped devices to provide real-time insights into crucial processes, has completely changed how industries function. However, the sheer volume and speed of data created in industrial environments pose many problems, particularly when it comes to anomaly detection.
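As a toy illustration of the kind of lightweight check an edge device can run against a sensor stream, the sketch below flags readings that deviate sharply from a sliding window of recent values; the rolling z-score used here is an illustrative stand-in, not necessarily the algorithm the paper employs:

```python
from collections import deque
from statistics import mean, stdev

class RollingAnomalyDetector:
    """Flag sensor readings that deviate strongly from a sliding window
    of recent values; cheap enough to run on an edge gateway."""

    def __init__(self, window=20, threshold=3.0):
        self.buffer = deque(maxlen=window)
        self.threshold = threshold

    def update(self, value):
        """Return True if `value` is anomalous relative to recent history."""
        anomalous = False
        if len(self.buffer) >= 2:
            mu, sigma = mean(self.buffer), stdev(self.buffer)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomalous = True
        self.buffer.append(value)
        return anomalous

det = RollingAnomalyDetector(window=10, threshold=3.0)
flags = [det.update(v) for v in [50.0] * 10 + [50.2, 49.9, 90.0]]
print(flags[-1])  # the 90.0 spike is flagged
```

Because only a small window of values is kept in memory, the check runs in constant space per sensor, which is what makes edge deployment on gateways and embedded systems practical.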
Traditional cloud-based solutions frequently suffer from latency and privacy issues, necessitating the development of edge-based real-time sensor data processing techniques. This study proposes a novel method for IIoT applications that focuses on processing sensor data at the edge, close to the data source, for anomaly identification. We enable real-time analysis of sensor data without the need for continuous data transfer to the cloud by utilising the processing capabilities of edge devices, such as industrial gateways and embedded systems. To find anomalies in streams of real-time sensor data, our methodology integrates data pre-processing, feature engineering, and machine learning algorithms. This strategy not only lessens the strain on the network's bandwidth but also ensures quick reaction to urgent situations, cutting downtime and boosting operational effectiveness. The proposed system has adaptive learning features that enable it to continuously adjust to changing ambient conditions and sensor properties, enhancing the precision of anomaly detection over time. We provide experimental findings that show how our edge-based anomaly detection system performs well in diverse industrial situations. The results show that, while protecting data privacy and minimising latency, our methodology outperforms conventional cloud-based methods in terms of anomaly detection performance.</p> Yogesh D. Deshpande S.R. Rahman Copyright (c) 2024 2023-12-31 2023-12-31 4 2 16 30 10.52710/rjcse.71 Precision Agriculture through Deep Learning Algorithms for Accurate Diagnosis and Continuous Monitoring of Plant Diseases https://technicaljournals.org/RJCSE/index.php/journal/article/view/72 <p>Precision agriculture is essential for sustainable food production, and one of its main tenets is the precise identification and ongoing surveillance of plant diseases.
Conventional approaches to disease monitoring and detection are frequently labour-intensive, time-consuming, and dependent on visual inspection, which increases the risk of misidentifying diseases. Deep learning algorithms have surfaced as a potentially effective way to tackle these issues. In this study, we introduce a novel method for precision agriculture that improves plant disease diagnostic accuracy and offers continuous monitoring by utilising deep learning algorithms. Our research uses cutting-edge convolutional neural networks (CNNs), including ResNet50, to precisely identify disease symptoms in plant images. The proposed deep learning model is trained on an extensive dataset of plant images illustrating a range of diseases, enabling it to identify minute visual cues that human observers might overlook. Compared to previous ML methods, the model's accuracy in detecting diseases is higher, which lowers the possibility of misdiagnosis and facilitates early intervention to minimise crop damage. By placing cameras and sensors in the fields, the proposed system provides continuous monitoring in addition to precise diagnosis. The proposed deep learning model processes the real-time data and images of the crops that are captured by these devices.</p> Parvaneh Basaligheh Ritika Dhabliya Copyright (c) 2024 2023-12-31 2023-12-31 4 2 31 45 10.52710/rjcse.72 Design and Implementation of a Fog Computing Architecture for IoT Data Analytics https://technicaljournals.org/RJCSE/index.php/journal/article/view/73 <p>The number of Internet of Things (IoT) devices has increased exponentially, creating an explosion of data that requires sophisticated processing and analysis techniques. Traditional cloud computing solutions may struggle to meet the latency and bandwidth demands of IoT applications.
To overcome these issues, fog computing has developed into a workable concept for extending cloud services to the network's edge. The design of a fog computing architecture for IoT data analytics is the topic of this study. Our system's three primary parts are fog nodes, edge devices, and a central cloud server. IoT sensors and edge devices are in charge of local preprocessing and data collection. Fog nodes act as intermediaries between edge devices and the cloud server, reducing the volume of raw data sent to the cloud for processing and archiving. The central cloud server manages long-term data analysis, storage, and archiving. To show how effective and efficient our architecture is, we evaluated it with data gathered from a variety of Internet-connected devices; by lowering the amount of data transferred to the cloud, we were able to considerably reduce latency and bandwidth usage. The fog nodes at the network's core also offered the processing capacity required to carry out analysis in near real time. Given that a large number of Internet of Things applications require real-time or near-real-time data processing, this architecture stands out for its capacity to lower latency, save bandwidth, and improve system efficiency.</p> Parikshit Mahalle Sheetal S. Patil Copyright (c) 2024 2023-12-31 2023-12-31 4 2 46 59 10.52710/rjcse.73 Implementation of Long Short-Term Memory (LSTM) Networks for Stock Price Prediction https://technicaljournals.org/RJCSE/index.php/journal/article/view/74 <p>In this research, we explore the potential of Long Short-Term Memory (LSTM) networks for predicting stock prices. Due to the complexities of the financial markets and the inherent volatility of stock prices, accurate forecasting is now crucial for investors and financial specialists.
LSTM, a type of recurrent neural network (RNN), has been shown to recognise temporal correlations and patterns in sequential data. Training and assessing the LSTM models in this work involves analysing stock price data, relevant financial measures, and market sentiment indicators. We investigated alternative architectures, hyperparameters, and preprocessing methods to boost the networks' performance. To further improve the model's generalizability, we utilise series normalisation and dropout to reduce overfitting. The outcomes demonstrate that the LSTM network outperforms standard time-series prediction methods in capturing and anticipating shifts in stock prices. We also conduct extensive backtesting and evaluation, using measures like mean squared error (MSE) and mean absolute error (MAE), to assess the model's accuracy and resilience. The results of this study shed light on how deep learning techniques, in particular LSTM networks, can be applied to the prediction of stock prices, potentially assisting traders, investors, and other financial decision-makers in navigating complex and volatile financial markets.</p> Vivek Deshpande Copyright (c) 2024 2023-12-31 2023-12-31 4 2 60 72 10.52710/rjcse.74 Transfer Learning Strategies for Fine-Tuning Pretrained Convolutional Neural Networks in Medical Imaging https://technicaljournals.org/RJCSE/index.php/journal/article/view/79 <p>In the area of medical imaging, transfer learning has become a potent technique that uses pretrained Convolutional Neural Networks (CNNs) to improve performance on particular tasks. This abstract gives an overview of several transfer learning techniques used for optimising pretrained CNNs in the context of medical image analysis. The size limitations of medical imaging datasets make it difficult to train deep learning models from scratch. Pretrained CNNs, such as those trained on huge natural image datasets like ImageNet, are a good starting point.
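The idea of starting from a pretrained network can be pictured with a deliberately simplified pure-Python sketch: a frozen random projection stands in for the pretrained backbone, and only a small linear head is trained on top. The toy task, dimensions, and training rule here are illustrative assumptions, not anything from the paper:

```python
import random

random.seed(0)

# "Pretrained backbone": a frozen linear projection standing in for the
# early layers of a pretrained CNN; its weights are never updated.
IN_DIM, FEAT_DIM = 4, 8
backbone = [[random.uniform(-1, 1) for _ in range(IN_DIM)] for _ in range(FEAT_DIM)]

def extract_features(x):
    # Frozen forward pass through the fixed backbone.
    return [sum(w * xi for w, xi in zip(row, x)) for row in backbone]

# Trainable "head": one linear unit fitted with the perceptron rule.
head, bias = [0.0] * FEAT_DIM, 0.0

def predict(x):
    f = extract_features(x)
    return 1 if sum(w * fi for w, fi in zip(head, f)) + bias > 0 else 0

# Toy task with a clean margin: class 1 iff the input coordinates sum above zero.
data = []
while len(data) < 200:
    x = [random.uniform(-1, 1) for _ in range(IN_DIM)]
    if abs(sum(x)) > 0.2:
        data.append((x, 1 if sum(x) > 0 else 0))

for _ in range(25):  # only the head and bias are ever updated
    for x, y in data:
        err = y - predict(x)
        if err:
            f = extract_features(x)
            head = [w + 0.1 * err * fi for w, fi in zip(head, f)]
            bias += 0.1 * err

accuracy = sum(predict(x) == y for x, y in data) / len(data)
print(accuracy)
```

The point of the sketch is the split: the backbone is reused as-is, and only the small head needs data to train, which is exactly why the approach suits small medical imaging datasets.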
When these pretrained models are applied to medical imaging applications, fine-tuning is frequently used. One common method is feature extraction, where the lower layers of the pretrained CNN are frozen and operate as feature extractors. The extracted features are then fed into a bespoke classifier for the specific medical task at hand. This method benefits from the pretrained network's ability to recognise subtle image patterns. Another strategy is to fine-tune the CNN as a whole, which enables the model to adapt to the characteristics of medical images. Small learning rates are frequently used in transfer learning to avoid overfitting during fine-tuning. Additionally, domain-specific data augmentation is essential to further enhance model generalisation. The use of ensemble approaches, which combine several pretrained CNNs, is also investigated. These models can offer diverse feature representations and improve classification precision. To bridge the domain gap between natural images and medical images, domain adaptation techniques are also used. One approach to aligning feature distributions is adversarial training; another is domain-specific batch normalisation. Feature extraction, network fine-tuning, ensemble approaches, and domain adaptation are all part of the transfer learning methodologies for optimising pretrained CNNs in medical imaging.
Researchers have made great progress using these techniques in a number of medical image processing tasks, demonstrating the value of transfer learning in this important area.</p> Muhamad Angriawan Copyright (c) 2024 2023-12-31 2023-12-31 4 2 73 88 10.52710/rjcse.79 Real-Time Emotion Recognition using Deep Facial Expression Analysis on Mobile Devices https://technicaljournals.org/RJCSE/index.php/journal/article/view/80 <p>The growth of mobile devices has created new opportunities for real-time emotion recognition, enabling numerous applications in human-computer interaction, healthcare, and other fields. This research introduces a novel method for real-time emotion recognition through deep facial expression analysis on mobile devices. To execute accurate and efficient emotion recognition directly on the device, our technology makes use of the computational capacity of contemporary smartphones. This eliminates the need for cloud-based processing and ensures user privacy. We use a deep learning architecture optimised for both speed and accuracy on mobile platforms. The main elements of our system are a lightweight convolutional neural network (CNN) for facial feature extraction and a recurrent neural network (RNN) for temporal emotion modelling. Our system accurately detects and categorises a variety of emotions, including joy, sadness, anger, surprise, and more, by processing video frames from the device's camera feed in real time. We carried out extensive trials on a variety of datasets to assess our methodology, attaining state-of-the-art accuracy with minimal processing cost. Through a variety of applications, including emotion-aware virtual assistants, mental health tracking tools, and immersive gaming experiences, we show how useful our technology is.
This paper contributes to the burgeoning field of mobile-based emotion recognition by providing a robust and effective solution that enables researchers and developers to produce ground-breaking software that can better comprehend and react to human emotions while protecting data privacy and guaranteeing real-time performance on mobile devices.</p> Ankur Gupta Sweta Batra Copyright (c) 2024 2023-12-31 2023-12-31 4 2 89 102 10.52710/rjcse.80 Implementation and Evaluation of Intrusion Detection Systems using Machine Learning Classifiers on Network Traffic Data https://technicaljournals.org/RJCSE/index.php/journal/article/view/81 <p>Robust Intrusion Detection Systems (IDS) are now essential given how heavily critical services and communication rely on digital networks. Through the use of machine learning classifiers on network traffic data, this research presents the deployment and thorough evaluation of IDS. The first step of the study is to gather and preprocess a broad dataset of network traffic, which includes both legitimate and malicious activity. A high-dimensional feature set is produced by extracting important information from the raw data using feature engineering techniques. To model the patterns of network traffic, a variety of machine learning methods are used, such as Decision Trees, Random Forests, Support Vector Machines, and Neural Networks. The models are also put to the test in a variety of situations, such as those with changing levels of network traffic, different kinds of attacks, and varying false-positive rates. Results show that machine learning-based IDS is more accurate than conventional rule-based systems at identifying and categorising network intrusions. The models' ability to scale and adapt to new threats is also assessed.
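The metrics such an IDS evaluation relies on, including precision, recall (detection rate), and false-positive rate, reduce to a simple confusion-matrix computation. A small pure-Python sketch with made-up toy labels:

```python
def ids_metrics(y_true, y_pred):
    """Compute detection metrics for a binary IDS task.
    Label 1 = intrusion, 0 = benign traffic."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0  # detection rate
    fpr = fp / (fp + tn) if fp + tn else 0.0     # false-positive rate
    return {"precision": precision, "recall": recall, "fpr": fpr}

# Toy example: 8 flows, 4 real intrusions, one missed, one false alarm.
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 1, 0, 0, 0]
print(ids_metrics(y_true, y_pred))  # precision 0.75, recall 0.75, fpr 0.25
```

Reporting the false-positive rate alongside recall matters in this setting because a rule that flags everything achieves perfect recall while being operationally useless.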
This research provides a thorough examination of IDS utilising machine learning classifiers on real network traffic data, advancing network security. The results highlight the value of machine learning in improving the accuracy and robustness of intrusion detection systems and protecting crucial network infrastructures from emerging cyber threats.</p> Romi Morzelona Riddhi R. Mirajkar Copyright (c) 2024 2023-12-31 2023-12-31 4 2 103 116 10.52710/rjcse.81 Real-Time Malware Detection on IoT Devices using Behavior-Based Analysis and Neural Networks https://technicaljournals.org/RJCSE/index.php/journal/article/view/82 <p>IoT devices' constrained processing capabilities and malware's changing nature make traditional signature-based methods for malware detection ineffective. In contrast, our suggested approach focuses on real-time analysis of IoT device behaviour patterns to find anomalies that might be signs of malicious activity. By continuously observing how devices behave, we can spot deviations from typical behaviour that could indicate the presence of malware. To handle and analyse the enormous quantity of data produced by IoT devices, we use deep neural networks, specifically recurrent neural networks (RNNs) and convolutional neural networks (CNNs). These neural networks learn the expected behaviours of various IoT devices and their applications through training on historical data. By comparing incoming data streams to these learned patterns in real time, they quickly detect unexpected behaviours that can be a sign of malware infections or other harmful actions. Our experimental results show the efficiency of the suggested approach, reaching high detection rates while preserving low false-positive rates.
By integrating this real-time malware detection technology into IoT devices or gateways, we can greatly improve their security posture, defending against new attacks in the ever-changing IoT landscape. By protecting the privacy and integrity of IoT-enabled environments, our research will help to mitigate the escalating cybersecurity challenges faced by IoT devices.</p> Amruta V. Pandit Dipannita Mondal Copyright (c) 2024 2023-12-31 2023-12-31 4 2 117 129 10.52710/rjcse.82 Smart Home Automation using IoT: Prototyping and Integration of Home Devices https://technicaljournals.org/RJCSE/index.php/journal/article/view/83 <p>Smart homes are becoming a reality thanks to the rapid progress of Internet of Things (IoT) technology, providing homeowners with never-before-seen levels of efficiency, convenience, and security. This abstract summarises a project that combines prototyping and integration to turn a standard house into a modern smart home. The paper focuses on creating an extensive IoT ecosystem that seamlessly connects diverse household equipment, such as lighting controls, thermostats, and security cameras, to a single, intelligent network. We use cutting-edge IoT communication and protocol technologies to create robust connectivity between these devices, enabling homeowners to remotely monitor and control them from a central hub or their smartphones. The central controller, which includes cutting-edge sensors and AI algorithms, is the brain of our smart home system. To reduce energy usage, improve security, and accommodate resident preferences and routines, this controller gathers and examines data from the linked devices. It learns user patterns through machine learning and suggests automation procedures to make life easier. A smart home system prioritises security and privacy in addition to increasing convenience.
We also place a strong emphasis on flexibility and scalability, which enable the incorporation of new hardware and features as the IoT landscape changes. For homeowners, tech enthusiasts, and developers interested in building their own smart houses, our project on prototyping and integration serves as a model. It highlights how IoT technology has the potential to upgrade conventional living spaces into intelligent, networked environments that improve our quality of life while fostering energy efficiency and security. In the end, this project advances the continued development of smart home automation by increasing everyone's access to and ability to customise it.</p> Malkeet Singh Anishkumar Dhablia Copyright (c) 2024 2023-12-31 2023-12-31 4 2 130 143 10.52710/rjcse.83 IoT-Based Health Monitoring System: Design, Implementation, and Performance Evaluation https://technicaljournals.org/RJCSE/index.php/journal/article/view/84 <p>Rapid improvements in Internet of Things (IoT) technology have opened the door for creative solutions across a range of industries, including healthcare. This study describes the design, implementation, and performance evaluation of an IoT-based health monitoring system that aims to revolutionise patient care and healthcare administration. To continuously gather and send health-related data, our system makes use of a network of wearable sensors and devices that are seamlessly incorporated into a patient's daily life. Vital signs such as heart rate, blood pressure, temperature, and activity levels are tracked by these devices. To enable real-time analysis and storage, the data is securely sent to a centralised server. Both patients and healthcare professionals can access this information through a user-friendly smartphone application, enabling proactive healthcare decision-making. An efficient and scalable architecture is used in the implementation of this system to guarantee the confidentiality, accuracy, and reliability of the data.
The data is analysed using machine learning algorithms, which enables the early identification of abnormalities and trends that could portend serious health problems. The system can also produce warnings and notifications, ensuring prompt intervention when necessary. The performance evaluation of our IoT-based health monitoring system demonstrates its effectiveness in enhancing healthcare outcomes. The solution gives healthcare professionals immediate access to crucial health information, allowing them to personalise treatment regimens, offer remote consultations, and make informed decisions. Continuous monitoring benefits patients by allowing for early intervention, fewer hospital stays, and an improvement in general health. Additionally, the system's scalability and versatility make it appropriate for a variety of healthcare settings, from small-scale home care to extensive hospital networks.</p> Nouby M. Ghazaly Naveen Jain Copyright (c) 2024 2023-12-31 2023-12-31 4 2 144 159 10.52710/rjcse.84 Edge-Enabled Smart Traffic Management System: An IoT Implementation for Urban Mobility https://technicaljournals.org/RJCSE/index.php/journal/article/view/85 <p>Effective traffic management has emerged as a critical issue in today's rapidly urbanising areas with a rising number of cars. This paper puts forth a novel approach: an Edge-Enabled Smart Traffic Management System (EESTMS) built on the Internet of Things (IoT). EESTMS harnesses the promise of edge computing and IoT technology to improve urban transportation. A large network of strategically placed sensors and cameras dispersed around the city forms the system's backbone. These devices continuously gather data on the volume, speed, and congestion of moving vehicles, offering valuable insight into traffic trends. We can lessen latency and lighten the load on centralised systems by processing this data at the edge.
Machine learning techniques are used to analyse the data and identify traffic bottlenecks and congestion hotspots. Real-time analysis allows for dynamic traffic signal adjustments, which optimise traffic flow and shorten commuter travel times. EESTMS also offers a user-friendly interface with real-time traffic information, alternate routes, and tailored navigation advice, accessible via mobile applications and web platforms. By making better-informed decisions, commuters can lessen their stress and carbon footprint. EESTMS plays a critical role in advancing sustainability by reducing fuel consumption and greenhouse gas emissions through effective traffic management, in addition to enhancing urban mobility. By giving emergency vehicles priority routing, this system also improves emergency response times. The application of EESTMS has shown promising outcomes in terms of lessened traffic congestion, improved commuter experiences, and lower environmental impact. Innovative solutions like EESTMS can open the door to smarter, more sustainable urban mobility as cities continue to grow, ultimately enhancing citizens' quality of life.</p> A. Kingsly Jabakumar Copyright (c) 2024 2023-12-31 2023-12-31 4 2 160 173 10.52710/rjcse.85 Real-Time Water Quality Monitoring in Aquaculture using IoT Sensors and Cloud-Based Analytics https://technicaljournals.org/RJCSE/index.php/journal/article/view/86 <p>The health and productivity of aquatic organisms in aquaculture systems depend on maintaining ideal water quality conditions. Aquaculture significantly influences the production of seafood worldwide. Conventional monitoring techniques can be labour-intensive, infrequent, and lacking the timeliness necessary to prevent and alleviate unfavourable conditions. This study proposes a novel method for real-time water quality monitoring in aquaculture, utilising cloud-based analytics and Internet of Things (IoT) sensors.
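A minimal sketch of the kind of threshold alerting such a monitoring system might apply to incoming sensor readings; the parameter names and safe ranges below are illustrative placeholders, not aquaculture recommendations:

```python
# Illustrative safe operating ranges; placeholders, not agronomic advice.
SAFE_RANGES = {
    "ph": (6.5, 8.5),
    "water_temp_c": (24.0, 30.0),
    "ammonia_mg_l": (0.0, 0.05),
}

def check_reading(reading):
    """Return a list of alert strings for any parameter outside its safe range."""
    alerts = []
    for param, value in reading.items():
        if param in SAFE_RANGES:
            lo, hi = SAFE_RANGES[param]
            if not (lo <= value <= hi):
                alerts.append(f"{param}={value} outside [{lo}, {hi}]")
    return alerts

reading = {"ph": 9.1, "water_temp_c": 27.0, "ammonia_mg_l": 0.02}
print(check_reading(reading))  # only pH is flagged
```

In a deployed system the ranges would come from domain experts or learned models, and alerts would be pushed to the remote dashboards the abstract describes rather than printed.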
Within aquaculture facilities, strategically positioned Internet of Things (IoT) sensors continuously collect information on key water quality factors, such as water and air temperature, light intensity, humidity, pH levels, wind speed, and ammonia nitrogen content. A cloud-based analytics platform receives real-time data from these sensors and processes and analyses it using cutting-edge algorithms. This holistic approach offers a variety of advantages. Real-time monitoring enables aquaculturists to quickly spot deviations from ideal conditions, lowering the danger of disease outbreaks and aquatic species mortality. To optimise farming practices and resource allocation, historical data gathered in the cloud is used to build predictive models. A major benefit is that aquaculturists may monitor and manage their systems remotely, which improves operational efficiency and lessens the need for on-site staff. The system can also send warnings and alarms in the event of anomalous conditions, ensuring quick reactions to urgent situations.</p> Nadica Stojanovic Sunita Chaudhary Copyright (c) 2024 2023-12-31 2023-12-31 4 2 174 187 10.52710/rjcse.86 Implementation of Wearable IoT Devices for Continuous Physiological Monitoring and Analysis https://technicaljournals.org/RJCSE/index.php/journal/article/view/87 <p>The emergence of the Internet of Things (IoT) has completely changed the healthcare industry, making possible the development of wearable devices for ongoing physiological monitoring and analysis. This study discusses the use of such wearable IoT devices and their potential to improve healthcare delivery. Our research focuses on the development, manufacture, and deployment of wearable sensors capable of real-time monitoring of vital signs like heart rate, body temperature, blood pressure, and activity levels. These devices leverage the Internet of Things to transmit data wirelessly to a cloud service for further analysis.
The core components of these devices are miniature sensors, low-power microcontrollers, and wireless communication modules. These developments permit the unobtrusive and continuous collection of data, which aids in the early detection of irregularities and permits timely medical intervention. Data collected by these devices is transmitted securely to a cloud-based server, where it is processed by sophisticated analytics and machine learning algorithms. As a result, doctors and nurses are able to anticipate health issues, monitor their patients' physical condition in real time, and provide individualised care. Patients have more control over their own medical decisions since they have access to their own data. Frequent use of Internet-connected devices for monitoring and analysing physiological signals may lead to better medical care, lower hospitalisation costs, and higher quality of life. Additionally, these devices have the potential to revolutionise clinical research by providing extensive real-world data for medical research. These wearable IoT devices have the potential to enhance the management of chronic illnesses, encourage early treatment, and raise general wellbeing. Wearable physiological monitoring technology holds great promise for the future of healthcare thanks to advancements in the Internet of Things.</p> Anasica S. Copyright (c) 2024 2023-12-31 2023-12-31 4 2 188 200 10.52710/rjcse.87 Exploring Feature Engineering Strategies for Improving Predictive Models in Data Science https://technicaljournals.org/RJCSE/index.php/journal/article/view/88 <p>A crucial step in the data science pipeline, feature engineering has a big impact on how well predictive models perform. This study explores several feature engineering techniques and how they affect the robustness and accuracy of models.
In order to extract useful information from raw data and improve the predictive capability of machine learning models, we study a variety of techniques, from straightforward transformations to cutting-edge approaches. The study begins by investigating basic methods including data scaling, one-hot encoding, and handling missing values. We then move on to more complex techniques such as feature selection, dimensionality reduction, and interaction term creation. We also explore the possibilities of domain-specific feature engineering, which entails designing features tailored to the problem domain and utilising additional data sources to expand the feature space. To evaluate the efficacy of these methodologies, we run extensive experiments on numerous datasets spanning different sectors, such as healthcare, finance, and natural language processing. We evaluate model performance using metrics such as recall, accuracy, precision, and F1-score to obtain a comprehensive picture of how feature engineering affects various predictive tasks. This study also assesses the computational expense of each feature engineering technique, taking scalability and efficiency in practical applications into account. To assist practitioners in making informed choices during feature engineering, we address the trade-offs between model complexity and performance enhancements. Our results highlight the importance of feature engineering in data science and demonstrate how it can significantly improve prediction models in a variety of fields.
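The basic methods this abstract names (scaling, one-hot encoding, missing-value handling) can be illustrated with a small dependency-free sketch. The column values are invented for illustration; real pipelines would typically use a library such as scikit-learn instead of hand-rolled helpers.

```python
def min_max_scale(values):
    """Rescale a numeric column to [0, 1]; a constant column maps to 0.0."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def one_hot(values):
    """Encode a categorical column as a dict of 0/1 indicator columns."""
    categories = sorted(set(values))
    return {c: [1 if v == c else 0 for v in values] for c in categories}

def fill_missing(values):
    """Replace None entries with the mean of the observed entries."""
    present = [v for v in values if v is not None]
    fill = sum(present) / len(present)
    return [fill if v is None else v for v in values]

# Tiny worked example: impute, then scale, then encode a categorical column.
ages = fill_missing([20, None, 40])      # mean imputation -> [20, 30.0, 40]
scaled = min_max_scale(ages)             # -> [0.0, 0.5, 1.0]
encoded = one_hot(["a", "b", "a"])       # indicator columns for "a" and "b"
```

Note the trade-off the abstract alludes to: one-hot encoding widens the feature space (one column per category), which is exactly the kind of cost the paper's scalability analysis weighs against accuracy gains.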
This study is a useful resource for data scientists because it emphasises the significance of careful feature engineering as a foundation for creating reliable and accurate prediction models.</p> Ekaterina Katya Copyright (c) 2024 2023-12-31 2023-12-31 4 2 201 215 10.52710/rjcse.88 A Detailed Study on IIR-FIR Filters and Design of a Graphical User Interface for Simulation of EEG Signals https://technicaljournals.org/RJCSE/index.php/journal/article/view/89 <p>Electroencephalography (EEG) plays a pivotal role in understanding brain activity and identifying neurological disorders. The precise analysis and interpretation of EEG signals require the application of digital signal processing techniques, such as Infinite Impulse Response (IIR) and Finite Impulse Response (FIR) filters. This paper presents a comprehensive exploration of IIR-FIR filters and their application in EEG signal processing. The first part of this research involves a detailed examination of IIR and FIR filter types, their mathematical foundations, advantages, and drawbacks. Various filter design methods, such as Chebyshev, Butterworth, and Elliptic filters, are discussed and compared through a literature survey. Theoretical aspects, including filter design, transfer functions, and frequency responses, are presented in a clear and accessible manner. The second phase of the study introduces the design of a Graphical User Interface (GUI) built on a mathematical modelling tool, aimed at enabling EEG signal simulation and analysis. This GUI is designed to enable users, including researchers and clinicians, to generate synthetic EEG signals with controllable parameters, apply IIR-FIR filters in real time, and visualize the filtered signals. The interface offers user-friendly controls for customizing filter characteristics, such as filter order, cutoff frequencies, and filter type.
To validate the effectiveness of the designed GUI and the selected IIR-FIR filters, simulations are conducted using EEG datasets. The results showcase the GUI's efficacy in real-time EEG signal processing, demonstrating its potential in research, clinical diagnostics, and educational settings. In summary, this paper presents a thorough investigation of IIR-FIR filters, offering insights into their theory and practical application for EEG signal processing. The development of an accessible GUI for EEG signal simulation and analysis further extends the reach of these mathematical modelling tools to a wider audience, ultimately contributing to advances in the field of neuroscience and brain signal analysis.</p> Dipannita Debasish Mondal Mukil Alagirisamy Copyright (c) 2024 2023-12-31 2023-12-31 4 2 216 225 10.52710/rjcse.89
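As a concrete illustration of the FIR side of the filters this abstract surveys, here is a minimal windowed-sinc low-pass FIR design applied by direct convolution to a synthetic EEG-like signal. This is a generic textbook construction, not the paper's method; the cutoff, sampling rate, and tap count are arbitrary choices for the sketch.

```python
import math

def lowpass_fir_taps(cutoff_hz, fs_hz, num_taps):
    """Windowed-sinc low-pass FIR coefficients with a Hamming window."""
    fc = cutoff_hz / fs_hz            # normalised cutoff, cycles per sample
    m = num_taps - 1
    taps = []
    for n in range(num_taps):
        k = n - m / 2
        # Ideal sinc low-pass response, handling the k == 0 singularity.
        h = 2 * fc if k == 0 else math.sin(2 * math.pi * fc * k) / (math.pi * k)
        # Hamming window to suppress passband/stopband ripple.
        h *= 0.54 - 0.46 * math.cos(2 * math.pi * n / m)
        taps.append(h)
    s = sum(taps)                     # normalise for unity gain at DC
    return [t / s for t in taps]

def fir_filter(signal, taps):
    """Direct-form convolution of a signal with the FIR taps (causal)."""
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, t in enumerate(taps):
            if i - j >= 0:
                acc += t * signal[i - j]
        out.append(acc)
    return out

# Synthetic test signal: a 10 Hz "alpha-band" tone plus 60 Hz mains
# interference, sampled at 250 Hz (all values chosen for illustration).
fs = 250
sig = [math.sin(2 * math.pi * 10 * n / fs)
       + 0.5 * math.sin(2 * math.pi * 60 * n / fs)
       for n in range(500)]
filtered = fir_filter(sig, lowpass_fir_taps(30.0, fs, 51))
```

A 30 Hz cutoff passes the 10 Hz component while attenuating the 60 Hz interference, which is the typical pre-processing step before EEG band analysis; an IIR design (e.g. Butterworth) would achieve a similar response with far fewer coefficients at the cost of nonlinear phase.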