Application layer attacks – Examining Security and Privacy in IoT

Application layer attacks target the software and services that run on IoT devices or the cloud services that manage them. Attackers could exploit vulnerabilities in the software or firmware running on the device to gain control over it or access sensitive data. Attackers could also launch attacks such as SQL injection or cross-site scripting (XSS) attacks on the web applications used to manage the devices.

IoT networks face a wide range of attacks, and each layer of the network presents different vulnerabilities. IoT security must be implemented at each layer of the network to mitigate the risks associated with these attacks. The use of encryption, authentication, and access controls can help to secure physical devices and the data transmitted between them. Regular updates and patches should be applied to the software and firmware running on the devices to address any known vulnerabilities. Overall, a layered security approach that considers the entire IoT ecosystem can provide a more robust defense against attacks.

We can see different forms of attacks on embedded IoT systems in Figure 11.2:

Figure 11.2 – Different attacks on embedded systems

The diagram provides a structured view of potential vulnerabilities an embedded system may face, categorizing them by the method or vector of the attack. It groups the attacks into three main types: software-based, network-based, and side-channel, described as follows:

Software-based attacks:

  • Malware: Malicious software intended to damage or exploit an embedded system
  • Brute-forcing access: A method of trial and error whereby an attacker attempts to guess the correct access credentials
  • Memory-buffer overflow: A situation where a program writes data outside the bounds of pre-allocated fixed-length buffers, leading to potential code execution or system crashes
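Brute-forcing, in particular, is commonly mitigated by throttling or locking an account after repeated failures. The following is a minimal Python sketch of that idea; the lockout threshold and the in-memory failure store are illustrative assumptions, not a production design:

```python
MAX_ATTEMPTS = 5  # illustrative lockout threshold


class LoginGuard:
    """Tracks consecutive failed logins per user and locks after a limit."""

    def __init__(self):
        self.failures = {}

    def record_failure(self, user: str) -> None:
        self.failures[user] = self.failures.get(user, 0) + 1

    def record_success(self, user: str) -> None:
        # Reset the counter once the user authenticates successfully
        self.failures.pop(user, None)

    def is_locked(self, user: str) -> bool:
        return self.failures.get(user, 0) >= MAX_ATTEMPTS
```

A real device would persist the counters and add exponential back-off, but even this simple gate turns an online brute-force attack from thousands of guesses into a handful.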

Network-based attacks:

  • Man-in-the-middle (MITM): An attack where the attacker secretly relays and possibly alters the communication between two parties who believe they are communicating directly with each other
  • Domain Name System (DNS) poisoning: An attack where the attacker redirects DNS entries to a malicious site
  • Distributed denial-of-service (DDoS): An attempt to disrupt the regular functioning of a network by flooding it with excessive traffic
  • Session hijacking: When an attacker takes over a user’s session to gain unauthorized access to a system
  • Signal jamming: An interference with the signal frequencies that an embedded system might use, rendering it inoperable or reducing its efficiency

Side-channel attacks:

  • Power analysis: Observing the power consumption of a device to extract information
  • Timing attacks: Analyzing the time taken to execute cryptographic algorithms to find vulnerabilities
  • Electromagnetic analysis: Using the electromagnetic emissions of a device to infer data or operations
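The timing-attack entry deserves a concrete illustration: comparing a secret with `==` returns as soon as the first byte differs, so response time leaks how much of the secret an attacker has already guessed. Python's standard `hmac.compare_digest` avoids this. A minimal sketch, with an illustrative token value:

```python
import hmac

SECRET_TOKEN = "s3cr3t-token"  # illustrative device secret


def insecure_check(supplied: str) -> bool:
    # '==' short-circuits at the first mismatching character, so the
    # time taken leaks how long the correct prefix of the guess is.
    return supplied == SECRET_TOKEN


def constant_time_check(supplied: str) -> bool:
    # compare_digest takes time independent of where the inputs differ,
    # defeating the timing side channel.
    return hmac.compare_digest(supplied.encode(), SECRET_TOKEN.encode())
```

The two functions return the same results; only their timing behavior differs, which is exactly what a side-channel attacker measures.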

With that understanding, we can now look at how cloud providers such as Amazon Web Services (AWS) provide powerful tools to manage security on the platform.

Security and privacy controls within the cloud management landscape

As more and more IoT devices are connected to the internet, cloud management has become an essential component of IoT networks. The cloud provides a scalable, flexible, and cost-effective solution for storing and processing the vast amounts of data generated by IoT devices. However, with the benefits of the cloud also come security and privacy concerns.

This section will discuss security and privacy controls that are necessary within the cloud management landscape to ensure the safe and effective operation of IoT networks. We will explore key security and privacy considerations in the cloud, including data encryption, identity and access management (IAM), network security, and compliance with regulatory requirements.

Types of attacks

IoT networks face numerous threats that come from various sources. Attackers could target physical devices, communication channels, or the cloud services that manage the devices. Each layer of the IoT network presents a different vulnerability, and attackers have different techniques for exploiting each layer.

Physical layer attacks

Physical attacks on IoT devices involve gaining access to the devices through direct manipulation. Attackers could physically connect to the device’s ports, such as USB or Ethernet ports, and install malicious firmware or software to take control of the device. Attackers could also use side-channel attacks to obtain sensitive information from the device’s hardware or firmware, such as encryption keys or other authentication data.

Data link layer attacks

Data link layer (DLL) attacks involve intercepting or manipulating communication between IoT devices and the network. Attackers could use techniques such as packet sniffing or man-in-the-middle (MitM) attacks to capture and modify data being transmitted between devices. Attackers could also use spoofing attacks to impersonate legitimate devices or gateways to gain access to the network.
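Spoofing and tampering at this layer can be countered by authenticating each message with a shared key, so that a forged or modified frame fails verification. The following is a minimal sketch using Python's standard hmac module; the key value and how it is provisioned to devices are assumptions:

```python
import hashlib
import hmac

DEVICE_KEY = b"pre-shared-device-key"  # key provisioning is assumed


def sign_message(payload: bytes) -> bytes:
    # Attach an HMAC-SHA256 tag so the receiver can verify origin and integrity
    return hmac.new(DEVICE_KEY, payload, hashlib.sha256).digest()


def verify_message(payload: bytes, tag: bytes) -> bool:
    # A spoofed or tampered frame will not match its tag
    return hmac.compare_digest(sign_message(payload), tag)
```

Protocols such as Zigbee and Thread build comparable integrity checks into their link and network layers; this sketch only shows the principle.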

Network layer attacks

Network layer attacks focus on disrupting the network infrastructure that connects IoT devices. Attackers could launch DDoS attacks to overload the network with traffic, causing it to become unresponsive. Attackers could also exploit vulnerabilities in the routing protocols used by IoT networks to redirect or manipulate data traffic.

Challenges within security on IoT networks

The increasing number of connected devices in IoT networks has raised several security concerns. These concerns include the following:

Lack of encryption: Many IoT devices do not have proper encryption protocols in place, making them vulnerable to attacks that can compromise user data and personal information.

Weak authentication and authorization: IoT devices often use weak passwords or default credentials, making them susceptible to brute-force attacks. Additionally, many IoT devices do not implement proper authentication and authorization mechanisms, allowing unauthorized access to sensitive data.
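A first line of defense against the default-credential problem is simply refusing known factory pairs at configuration time. A toy Python check follows; the credential list is illustrative and far from exhaustive:

```python
# Well-known factory-default pairs; a real deny list would be much larger
# and ideally sourced from a maintained database.
DEFAULT_CREDENTIALS = {
    ("admin", "admin"),
    ("admin", "1234"),
    ("root", "root"),
    ("user", "password"),
}


def is_default_credential(username: str, password: str) -> bool:
    """Return True if the pair matches a known factory default."""
    return (username, password) in DEFAULT_CREDENTIALS
```

Fleet-onboarding tooling can run such a check and force a password change before a device is admitted to the network.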

Inadequate software updates and patching: IoT devices may not have proper mechanisms for software updates and patching, making them vulnerable to known vulnerabilities and exploits.

Lack of standardization: There is a lack of standardization in IoT devices, making it difficult for manufacturers to provide security updates and for security researchers to identify vulnerabilities.

Physical security: IoT devices may be easily physically accessible, making them vulnerable to physical attacks and tampering.

Malware and botnets: IoT devices can be infected with malware and used as part of a botnet to launch distributed denial-of-service (DDoS) attacks and other malicious activities.

Privacy concerns: IoT devices often collect and store sensitive data, raising privacy concerns if the data is not properly secured.

Lack of awareness: Users may not be aware of the security risks associated with IoT devices and may not take appropriate measures to secure their devices and networks.

After seeing the different challenges, we can now take a look at some recommendations for remediating them properly.

Security recommendations

To enhance the security of IoT networks, it’s essential to integrate both general security practices and the specific guidelines outlined by industrial standard architectures such as Matter, Thread, Zigbee, MQTT, and Wi-SUN. These standards provide well-rounded security mechanisms tailored for IoT environments. The following recommendations align with these standards:

Secure communication: IoT devices must utilize secure communication protocols such as HTTPS, TLS, or SSL, which are integral to standards such as MQTT and Wi-SUN. These protocols encrypt data transmitted between devices and servers, ensuring adherence to industry benchmarks for secure communication.
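As a concrete example of this point, Python's standard ssl module can enforce certificate verification and a modern protocol floor before any application data (MQTT, HTTP) is exchanged. A sketch; the broker hostname in the comment is an assumption:

```python
import ssl

# Client-side TLS context with certificate and hostname verification enabled
context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse older protocol versions

# context.wrap_socket(raw_socket, server_hostname="broker.example.com")
# would then secure the connection before the MQTT handshake begins.
```

MQTT client libraries typically accept such a context (or equivalent TLS parameters) directly, so the same settings apply whether the transport carries MQTT or HTTPS.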

Access control: Strong authentication and authorization mechanisms should be implemented as per the guidelines of these standards. This ensures that only authorized devices or users gain access to the IoT network, aligning with the security protocols of Matter and Zigbee.

Regular software updates: Consistent updating of IoT devices with the latest security patches and firmware is crucial. This practice aligns with the maintenance protocols recommended by these standards, ensuring devices remain safeguarded against evolving threats.

Data encryption: Encryption of stored and transmitted data is a core aspect of these standards. By encrypting data, IoT devices comply with industry practices, ensuring robust protection against unauthorized access or interception.

Privacy protection: Designing IoT devices to protect user privacy is a fundamental aspect of these standards. This involves limiting the collection of personal data and providing transparent privacy policies, in line with the privacy guidelines of standards such as Thread and Matter.

Physical security: Implementing physical security measures such as tamper-proofing and anti-theft mechanisms is crucial. These measures are often outlined in the security protocols of these standards, ensuring a comprehensive approach to physical security in IoT environments.

Monitoring and analytics: Real-time monitoring and analytics are essential for detecting and responding to security incidents. This practice is often emphasized in these standards, promoting proactive security management in IoT networks.

Vendor security assessment: Conducting a thorough security assessment of IoT devices before integration is crucial. This assessment should ensure that the devices comply with the required security standards, aligning with the industry benchmarks set by these architectures.

By implementing these security recommendations, organizations can reduce the risk of security breaches and protect their IoT networks from malicious attacks. However, it is important to note that security is an ongoing process and must be regularly reviewed and updated to address emerging threats and vulnerabilities.

With that, we’ve gained a better understanding of the current state of security within IoT environments, including challenges and solutions for it. Now, we can take a look at how it is implemented alongside controls within the cloud environment.

Technical requirements

From smart homes to connected cars, IoT devices have become ubiquitous in our daily lives. However, with increased connectivity and data exchange comes the risk of security and privacy breaches. As more and more sensitive information is transmitted through these devices, it is essential to examine security and privacy measures in place to protect both the users and the devices themselves.

In this chapter, we will explore various security and privacy concerns in IoT, including risks associated with data breaches and strategies used to mitigate them. We will also discuss the importance of privacy in IoT and how it is protected, as well as the challenges of implementing security measures in a rapidly evolving technological landscape. By examining these critical issues, we can gain a better understanding of the measures needed to ensure the security and privacy of IoT devices and networks.

In this chapter, we’re going to cover the following main topics:

The current state of risk and security within IoT

Security and privacy controls within the cloud management landscape

Risk management within the IoT landscape

Privacy and compliance within IoT networks

Cryptography controls for the IoT landscape

Practical – Creating a secure smart lock system

Technical requirements

This chapter will require you to have the following hardware and software installed:

Hardware:

  • Raspberry Pi
  • Single-channel relay
  • 1 kΩ resistor
  • Push button switch
  • Jumper cables
  • Breadboard
  • Mobile charger 5V/1A to use as power supply

Software:

  • Blynk app
  • Arduino IDE

You can access the GitHub folder for the code that is used in this chapter at https://github.com/PacktPublishing/IoT-Made-Easy-for-Beginners/tree/main/Chapter11/.

The current state of risk and security within IoT

As IoT technology continues to evolve and expand into new areas of our lives, it is critical that we understand the current state of risk and security within IoT networks. In this section, we will explore the current landscape of IoT security, including the most common types of IoT security threats and the current state of IoT security standards and regulations. We will also discuss best practices for securing IoT networks and devices, as well as challenges and opportunities for improving IoT security in the future. We can start off by taking a look at how security encompasses IoT in Figure 11.1:

Figure 11.1 – Overview of how security encompasses IoT

Figure 11.1 presents a structured overview of the current state of risk and security within IoT. The diagram is segmented into four main columns, representing distinct aspects of IoT: Device, Communications, Cloud platform and services, and Use Cases.

The diagram emphasizes the diverse facets of IoT, spanning from device-level hardware to broad use cases. It shows expansive areas where security is paramount in the IoT ecosystem, from individual devices and their communication pathways to the cloud platforms that store and process data, and finally, the real-world applications and sectors that implement IoT solutions.

We can continue the discussion by taking a look at challenges within security on IoT networks.

Monitoring ingested data – Working with Data and Analytics

Monitoring is an important component of analyzing IoT data, and we can do this as detailed next:

You can check messages that have been ingested into your channel through the AWS IoT Analytics console. Within the console, on the left pane, click on Channels and choose the name of the channel that we created.

On the page, scroll down to the Monitoring section. Adjust the displayed time frame as needed by choosing one of the preset time frame indicators. You will then see a graph showing the number of messages that were ingested into the channel during that period.

You can also check for pipeline activity executions. Follow the same workflow: click Pipelines, then the name of the pipeline that was created on the console, and adjust the time frame indicators. You will see a graph showing the number of pipeline activity execution errors in that period.

Now that we can monitor the data, let’s look at creating a dataset from our data.

Creating a dataset from the data

We can now look at creating a dataset from the data we have. Proceed as follows:

Navigate to Datasets from the sidebar and click on the Create button.

Name your dataset mydataset.

For the action, use a SQL expression such as select * from mydatastore.

Complete the dataset creation by clicking Create.

To generate dataset content, select the dataset and click on the Run Now button.

To view the content, click on the dataset name and navigate to its content. You should see the results and can even download them if they’re available.

With that, we have been able to create an end-to-end pipeline! Note that much of this infrastructure is still based solely on AWS. In real-world deployments, we would see more interconnectivity between AWS and on-premises equipment and see real-time data analysis being performed based on the data ingested.

As always, feel free to look at the documentation for all the hardware and software involved in this practical. We encourage you to explore further data analysis options, particularly in the transformation process when creating an ETL Glue job.

Summary

In this chapter, we have covered the fundamentals of data analytics within IoT workloads, discussing how different services within AWS can handle analysis and storage loads that are required as part of our IoT cloud workflow. We then learned more about how we can design and develop the implementation of our architecture according to our use case and learned how to practically use the offerings from AWS IoT Analytics to provision an end-to-end data pipeline.

In the next chapter, we will be discussing security and privacy within IoT, which is an imperative topic to talk about, given how it has become even more prevalent as more and more people have shifted their workloads into the cloud.

Further reading

For more information on what was covered in this chapter, please refer to the following links:

Understand more on data lakes and analytics solutions provided by AWS: https://aws.amazon.com/big-data/datalakes-and-analytics/

Review more IoT Analytics success cases provided by AWS: https://aws.amazon.com/iot-analytics/

Look at another reference architecture for an AWS serverless data analytics pipeline: https://aws.amazon.com/blogs/big-data/aws-serverless-data-analytics-pipeline-reference-architecture/

Take a look at an implementation of real-time monitoring of industrial machines that utilizes AWS IoT: https://ieeexplore.ieee.org/document/10016452/

Look at how a connected factory was able to leverage its offerings based on AWS IoT: https://aws.amazon.com/blogs/iot/connected-factory-offering-based-on-aws-iot-for-industry-4-0-success/

Explore more on architecting industrial IoT workloads based on the cloud: https://us.nttdata.com/en/blog/2022/september/architecting-cloud-industrial-iot-workloads-part-2

Ingesting data

We can now send some data through the pipeline. For this exercise, we will create mock data to send through. Since data ingestion cannot easily be simulated from the AWS console alone, we will use another AWS service, Lambda, to push data to IoT Analytics (using the AWS SDK directly is an alternative). Follow these steps:

Navigate to the AWS Lambda service in the AWS console.

Click the Create function button.

Choose Author from scratch. Provide a name for the Lambda function (for example, PushTemperatureDataToIoTAnalytics).

For the runtime, select a preferred language. For this walkthrough, we’ll assume Python 3.8.

Under Permissions, choose or create a role that has permissions to write to IoT Analytics.

Click Create function.

Write the Lambda function, as shown next. This code mocks temperature data and pushes it to AWS IoT Analytics:
import json
import random

import boto3

# Client for the AWS IoT Analytics service
client = boto3.client('iotanalytics')

def lambda_handler(event, context):
    # Mock a temperature reading between 15 and 35 degrees
    temperature_data = {
        'temperature': random.randint(15, 35)
    }
    # Push the reading into the 'mychannel' channel
    response = client.batch_put_message(
        channelName='mychannel',
        messages=[
            {
                'messageId': str(random.randint(1, 1000000)),
                'payload': json.dumps(temperature_data).encode()
            },
        ]
    )
    return {
        'statusCode': 200,
        'body': json.dumps('Temperature data pushed successfully!')
    }

The function creates a client to interact with AWS IoT Analytics using boto3. Inside the handler, it generates mock temperature data, where the temperature value is a random integer between 15 and 35 degrees, structured as a dictionary. A snapshot of the AWS Lambda code window is shown in Figure 10.9.

Figure 10.9 – AWS Lambda window for inserting code

Then, it sends this temperature data to a channel named mychannel in AWS IoT Analytics using the batch_put_message method. The message includes a unique ID, generated randomly, and the payload, which is the serialized temperature data. The function concludes by returning a success status code (200) and a message indicating the successful push of the temperature data.

Deploy the function after inserting the code.

At the top right of the Lambda function dashboard, click the Test button.

Configure a new test event. The actual event data doesn’t matter in this context since our function isn’t using it, so you can use the default template provided.

Name the test event and save it.

With the test event selected, click Test. If everything’s set up correctly, you should see an execution result indicating success, and your AWS IoT Analytics channel (mychannel in this example) should have received a temperature data point.

Having properly ingested the data, let’s now see how to monitor it.

Creating a channel

Let’s create our channel as part of the analytics workflow:

If you have not already, sign in to the AWS console. Afterward, navigate to the IoT Analytics console. For your convenience, here is a link to the console: https://console.aws.amazon.com/iotanalytics/.

In the IoT Analytics dashboard, click on Channels on the sidebar and click the Create channel button.

Provide a name for the channel (for example, mychannel) and follow through with the default settings. For the storage type, pick Service managed storage. Click on Create at the bottom to finish creating the channel.

You can view the created channel by navigating to the Channels section from the sidebar:

Figure 10.6 – Channel created in the IoT Analytics list of channels

With the channel created, we can now create a data store.

Creating a data store

Creating a data store is necessary to store data that has been put through the pipeline. We will walk through its creation here:

We can add multiple data stores, but for the purposes of this exercise, we will use a single data store. In the IoT Analytics dashboard, click on Data stores on the sidebar and click the Create data store button.

Choose Service managed storage as the storage type:

Figure 10.7 – Configuring the storage type used for the data store

Click Create to finalize the data store creation.

To view created data stores, go to the Datastore section from the sidebar.

With the data store created, we can now create a pipeline.

Creating a pipeline

A pipeline consumes messages that come from a channel and allows you to process and filter them before storing them within the data store. Here are the steps to create a pipeline:

In the IoT Analytics dashboard, click on Pipelines on the sidebar and click the Create pipeline button. It should then take you to the following screen:

Figure 10.8 – Configuring the pipeline ID and sources for the pipeline

Provide a name for the pipeline (for example, mypipeline) and follow through with the default settings. For Pipeline source, pick your newly created channel, and for Pipeline output, pick your newly created data store.

For Enrich, transform, and filter messages, pick Select attributes from the message.

Click on Create at the bottom to finish creating the pipeline.

As with previous steps, we can then check whether the pipeline was created successfully by viewing it on the Pipelines page of the IoT Analytics dashboard.
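Conceptually, the Select attributes activity chosen above keeps only the listed fields from each ingested message before it reaches the data store. The behavior can be sketched in a few lines of Python; the attribute names in the usage example are illustrative:

```python
def select_attributes(message: dict, attributes: list) -> dict:
    # Keep only the requested keys from the message, dropping everything else
    return {key: message[key] for key in attributes if key in message}
```

For example, `select_attributes({"temperature": 22, "debug": True}, ["temperature"])` keeps only the temperature field, which is how a pipeline strips diagnostic noise out of device payloads before storage.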

Having created a pipeline, we can now start ingesting some data through it.

A case study for data analytics

Now that we have seen use cases and have learned about how we can evaluate IoT deployments that leverage data analytics services on AWS, let’s take a look at how one industrial environment can utilize the AWS environment to perform data analytics workloads and the workflow behind it. We can see this case represented in Figure 10.4:

Figure 10.4 – AWS data analysis within an industrial environment

In this workflow, the industrial environment pushes data to AWS Greengrass, which uses the MQTT protocol to deliver it to AWS IoT Core. The data is then passed to AWS IoT Analytics and visualized via QuickSight. If an IoT rule is triggered, the data is instead routed to Amazon SNS, which notifies the operations team through an alert. Data can also be brought in by migrating the on-premises database to the cloud with Database Migration Service (DMS), a service used for migrating databases onto AWS; from there, it can be ingested using Amazon Kinesis Data Streams, processed with AWS Lambda, and fed into AWS IoT Analytics.

Now that we’ve become more familiar with these workflows for data analytics, let’s get on to our practical.

Practical – creating a data pipeline for end-to-end data ingestion and analysis

In this practical, we will look to create a data pipeline based on the AWS console. This will follow the architecture shown in the following diagram:

Figure 10.5 – Data pipeline workflow for data ingestion

We will have a device send data to a channel. The channel will receive the data and send it through the pipeline, which will pipe the data through to the data store. From the data store, we can then make SQL queries to create a dataset from which we will read the data.

We can now go ahead and start off by creating a channel.

Industrial data analytics

We have seen the usage of data analytics in the past two sections and how it can be beneficial for our workloads. Now, let’s look at how it can benefit industry cases and how we can accordingly evaluate our deployments based on the best practices that are set out for us.

Evaluating performance

Use services such as CloudWatch to monitor performance metrics of the IoT Analytics pipeline: the number of messages processed, the time taken to process each message, and the number of errors encountered. This will be critical for further analysis and eventual optimization. The following are factors to consider in evaluating performance:

Analyze your data: We can use IoT Analytics SQL or other data analytics tools to identify any patterns or issues that we may need to address if they affect system performance.

Optimize your pipeline: From the analysis of the data, we can optimize the pipeline by adding data normalization, validation, and modeling to improve the performance of the data analytics workloads.

Use best practices: We need to adhere to best practices for data analysis, which includes techniques such as normalization, data validation, and data modeling. For the scope of this book, we will not be covering this, but we encourage you to look up more of these techniques in the Further reading section and read up on the topics listed there.

Usage of third-party monitoring tools: We can utilize third-party monitoring tools to collect and analyze performance metrics for our analytics workload and gain more insights into how our pipeline is performing.

Monitor and track usage of resources: We need to keep an eye on resources such as CPU, memory, and storage that are used by our data analytics workloads, especially if they are consuming more resources than expected. If necessary, we should perform actions such as scaling our workloads up or optimizing the pipelines further.

Having understood how to keep track of performance, we can now review some different use cases of data analysis within industry.

Use cases within industry

Industry has many different use cases for performing data analysis on a myriad of data. Here are just a few prominent examples:

Predictive maintenance: Within this use case, IoT devices are used to collect real-time sensor data that is processed and analyzed using AWS IoT Analytics to detect patterns and accordingly predict when maintenance would be required. This will help organizations schedule maintenance at the required times, reducing downtime and improving the efficiency of equipment.

Smart agriculture: IoT sensors can be used to collect data on soil moisture and temperature, which is then analyzed within AWS IoT Analytics to optimize crop yields, reduce consumption of water, and improve overall farm efficiency.

Smart cities: IoT devices can be used to collect data on various aspects of urban infrastructure such as traffic, air quality, and energy usage. The data can then be analyzed through AWS IoT Analytics where it can then be used to improve traffic flow, reduce pollution, and optimize energy usage to ensure that cities become more sustainable and livable for their residents.

With those use cases in mind, we can now take a look at a case study of a data analytics flow used within a production environment in an industrial setting.

Practical – smart home insights with AWS IoT Analytics

Click on the target S3 bucket on the canvas and select the format as Parquet. Specify the new Amazon S3 bucket you created (for example, s3://your_bucket_name).

Go to the Job details tab and specify the IAM role you have been using so far. Leave everything else as it is. Rename the script filename to anything you want, as long as it ends with .py (for example, test.py):

Figure 10.3 – Configuring job details for the Glue job

Click Save, and afterward, click Run.

With that, we have appropriately transformed the data as needed.

Use Amazon Athena to query the transformed data.

We can now look at leveraging Amazon Athena to query the data that we have transformed:

  1. Navigate to the Amazon Athena service.
  2. On the sidebar, click on Query editor.
  3. There should be a prompt asking you to select an output location for your queries. Specify an S3 bucket or a folder within a bucket to do so.
  4. In the Athena dashboard, select AWSDataCatalog as the data source and SmartHomeData database (or your predefined values for them).
  5. Run the following query by clicking Run:

     SELECT * FROM mychannelbucket

  6. You should get the full table that you created before. Now, use SQL queries to answer the following questions:
     1. What is the average temperature, humidity, and light intensity for each day of the month?
     2. What is the average temperature, humidity, and light intensity for each hour of the day?
     3. What is the average temperature, humidity, and light intensity for each day of the week?
     4. What is the correlation between temperature and humidity and between temperature and light intensity?
  7. View the query results and save them to a new S3 bucket.
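An aggregation such as the hourly-average question above can be prototyped locally before being run in Athena. In the sketch below, sqlite3 stands in for Athena on a few mock rows; the table and column names (readings, ts, temperature, humidity) are assumptions, not the schema of your actual dataset:

```python
import sqlite3

# In-memory database with a handful of mock sensor readings
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts TEXT, temperature REAL, humidity REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [
        ("2023-01-01 09:15", 21.0, 40.0),
        ("2023-01-01 09:45", 23.0, 42.0),
        ("2023-01-01 10:30", 25.0, 38.0),
    ],
)

# Group readings by hour of day and average each metric
rows = conn.execute(
    "SELECT strftime('%H', ts) AS hour, AVG(temperature), AVG(humidity) "
    "FROM readings GROUP BY hour ORDER BY hour"
).fetchall()
print(rows)  # [('09', 22.0, 41.0), ('10', 25.0, 38.0)]
```

The Athena version uses the same GROUP BY pattern, with Presto/Trino date functions in place of strftime.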
In this practical exercise, we explored IoT data analytics using AWS services such as S3, Glue, and Athena. We loaded a dataset of IoT sensor readings into an S3 bucket, used Glue to transform the data and create a new table with additional columns, used Athena to query the transformed data and generate insights, and used QuickSight to visualize the insights and create a dashboard. Based on the insights generated, we provided recommendations for improving the smart home experience.

We will now move on to industrial data analytics.