Hi, I'm Matthew Clark, and this is lesson 3.4, IoT Ecosystems.
In this lesson, we'll look at the three-layer architecture, and we'll discuss the security controls at the perception layer, communication layer, and application layer.
And lastly, we'll explore fog computing. So let's get started.
In lesson 3.3, we looked at IoT architectures mainly from a data perspective.
We divided the architecture into stages, which included things, data acquisition, edge IT, and cloud services.
Each stage processed data in a certain way and then moved it along a path.
And based on that data, command and control was passed back down in the other direction.
There's no defined or accepted IoT architecture template to use. You're going to see the same information displayed many different ways, depending on different factors: who's displaying it, who the audience is, and, quite frankly, what the IoT application is itself. In the examples in this lesson,
we're going to divide the architecture into layers,
because we're focusing on the entire ecosystem, not just one device within it.
And so these layers will be the perception layer, which includes the IoT device, the actuator, and the sensor;
the network layer, or gateway;
and the application layer, which includes back-end systems.
So let's start with the perception layer.
The perception layer includes the physical layer of the architecture: the sensors, the actuators, the tags, and how those interact with the environment,
either to gather data, to make changes, or to inform the overall process,
and how they communicate with the network-layer devices, which are typically gateways.
This could be over personal area networks, local area networks, or wide area networks.
Those connectivity requirements really will dictate the type of communication protocol that will be used.
And remember that the device hardware itself could be limited: the processor in a sensor can be limited, its memory could be limited, and of course battery life is always a consideration.
These are things to think about.
So let's talk about the network layer.
This includes routers and gateways.
The job of this layer is to connect and communicate between the physical layer and the application layer.
This is where measurements are collected and where serial-to-digital conversion occurs. Pre-processing of data also happens at this level.
This layer also provides protocol translation, artificial intelligence, and pre-processing of data.
One of the typical functions of this layer is really just to act as the middleman.
The application layer is really where the cloud resides. It communicates with the gateway, typically over wired or cellular Internet. This could be AWS, Google Cloud, or your company's data center,
and it contains the physical servers and logical databases that are required to maintain the type of application processing that's needed.
Data storage is going to happen here. Big-data filtering, monitoring, and alerting are going to happen here, along with business logic, big data sets and processing, and large-scale analytics,
including third-party APIs. All those types of things are going to reside here in the application layer.
And so this type of three-layer ecosystem architecture is simple and easy to understand, right? It's great for back-of-the-napkin discussions, and it's also great if you need to whiteboard something and talk about a specific aspect of the architecture itself.
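To make those three layers concrete, here's a minimal Python sketch of a reading flowing from a perception-layer sensor, through a network-layer gateway that pre-processes it, up to an application-layer back end. The class names, device ID, and the Celsius-to-Fahrenheit conversion are my own illustration, not something from the lesson.

```python
# Hypothetical three-layer IoT pipeline: perception -> network -> application.

class Sensor:                       # perception layer: interacts with the environment
    def read(self):
        return {"device_id": "temp-01", "raw_c": 21.7}

class Gateway:                      # network layer: the "middleman"
    def preprocess(self, reading):
        # Pre-processing / translation before forwarding upstream.
        return {"id": reading["device_id"],
                "temp_f": reading["raw_c"] * 9 / 5 + 32}

class Backend:                      # application layer: storage, analytics, alerting
    def __init__(self):
        self.store = []             # stands in for a real database
    def ingest(self, message):
        self.store.append(message)
        if message["temp_f"] > 100:             # business logic / alerting
            print("ALERT:", message["id"])

sensor, gateway, backend = Sensor(), Gateway(), Backend()
backend.ingest(gateway.preprocess(sensor.read()))
print(backend.store)    # one stored message, roughly 71.06 °F
```

Each layer only talks to its neighbor, which is exactly why this model whiteboards so cleanly.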
So let's talk about perspective, especially as it relates to data and making decisions.
So how is cloud computing different from computing at the device or the gateway level?
Well, at the device and gateway level, we have limited data. Devices generate a lot of data, but it's limited in that it's only about that particular device over a small period of time.
And maybe that's OK, right, for some decision-making.
With cloud computing, we can generally make better long-term decisions, because they're based on multiple data sets and multiple sensors, and the data itself has been pre-processed and aggregated. So the amount of data you have is greater, and it covers a longer period of time.
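As a toy illustration of that difference (the sensor names and readings here are invented for the example), a single device only sees its own short window of data, while the cloud can aggregate pre-processed data from the whole fleet:

```python
# Device-level view vs. cloud-level aggregate (hypothetical readings).
from statistics import mean

device_readings = {          # each device only knows its own recent window
    "s1": [20.9, 21.1, 21.0],
    "s2": [24.8, 25.2, 25.0],
    "s3": [19.9, 20.1, 20.0],
}

# Device-level decision: one device, a small period of time.
s1_avg = mean(device_readings["s1"])                           # ~21.0

# Cloud-level decision: aggregated (pre-processed) data from all sensors.
fleet_avg = mean(mean(r) for r in device_readings.values())    # ~22.0

print(s1_avg, fleet_avg)
```

Device s1 alone would conclude "about 21 degrees," while the fleet-wide view is a degree warmer; the more data you aggregate, the better the long-term decision.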
So let's use the example of being at the beach and looking at the horizon.
Picture a swimmer in the water; this is a great image because it kind of makes the point at a high level.
If the swimmer is in the water, they have a very limited distance to their horizon, so it's next to impossible to see a small boat that's not that far away, say half a mile away. It would be really hard to see that boat.
But for a person of average height, about five feet seven inches,
standing at sea level on the shore,
the horizon is a lot greater. The horizon moves out to about five miles. So if you're in the water, you can only see about half a mile; if you're standing on the shore, you can see about five miles.
I live in North Carolina, and North Carolina has a lot of different lighthouses. There's one lighthouse I love: Cape Hatteras.
It's 187 feet tall. So if you stood on top of the Cape Hatteras Lighthouse,
then your horizon is going to be a lot different, right? You're going to see about 33 miles out.
And so you can use that same three-layer architecture: at the perception layer, you can see about half a mile out; at the network layer, where the gateways are, maybe you can see about five miles out; and at the data center, we're talking about 33 miles, right, or more.
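If you want to check the analogy, the standard geometric distance-to-horizon formula is d = sqrt(2Rh) for eye height h and Earth radius R. Note this pure-geometry version (no atmospheric refraction) gives somewhat smaller figures than the rounded numbers in the narration, but the relative jump from swimmer to shore to lighthouse is the same:

```python
# Geometric distance to the horizon: d = sqrt(2 * R * h).
# Pure geometry, ignoring atmospheric refraction; heights are rough estimates.
import math

R_MILES = 3959.0                        # mean radius of the Earth

def horizon_miles(eye_height_ft):
    h = eye_height_ft / 5280.0          # feet -> miles
    return math.sqrt(2 * R_MILES * h)

print(round(horizon_miles(1), 1))       # swimmer, eyes ~1 ft up:  ~1.2 mi
print(round(horizon_miles(5.58), 1))    # 5'7" person on shore:    ~2.9 mi
print(round(horizon_miles(187), 1))     # top of the lighthouse:  ~16.7 mi
```

The exact mileage matters less than the shape of the curve: each step up in height widens the view, just like each layer up in the architecture widens the data you can see.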
And so it's just another way to think about this.
Well, in the last section of this lesson, we'll talk about fog computing.
There are two terms, edge computing and fog computing, and they're somewhat competing terms.
Edge and fog computing are used in IoT architecture designs
to move services closer to the end device that were traditionally performed closer to the cloud.
So we're talking about pre-processing the data, or real-time services, and moving those closer to the device and, by extension, the end user.
There are lots of different uses for this: industrial IoT manufacturing applications, smart cities, smart critical infrastructure, even in-home IoT device ecosystems.
Let's see. If I were to try to define both of them myself, I would completely mess it up, so let me paraphrase Paul Butterworth. He's the co-founder and CEO of Vantiq; hopefully I said the name of his company correctly.
And I'll try to get as close to his definition as possible, based on what I read.
Edge computing is really closely related to devices that have attached sensors, or gateways that are in really close proximity to the sensors themselves. Fog computing moves that activity, which would occur on the gateway or the device, out to the LAN.
And because it's out on the LAN,
it may not have as close a proximity to the sensors, but it's not all the way out on the Internet.
And again, the idea here: last time we used the example of a fire alarm, but this time let's say that someone's trying to break into our house.
You wouldn't want that sensor to detect that the glass was broken or the door was opened,
and then send that information all the way through the ecosystem to the cloud, where it gets queued up and eventually processed, an alarm is triggered, and information is sent all the way back down to the IoT system,
where it goes to an actuator and turns on the alarm, right, or calls the police,
or something. In those cases, you want that activity to happen as close to the source as absolutely possible. And that's where edge computing, which is really close, comes into play, or fog computing, which is still close, closer than the Internet, but really on a LAN somewhere.
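The burglar-alarm scenario above can be sketched as a hypothetical edge rule (the event fields and function names here are my own invention): instead of queueing the event to the cloud and waiting for the round trip, the gateway or device evaluates the rule locally and drives the actuator immediately.

```python
# Hypothetical edge-computing rule: handle the urgent event at the source
# instead of round-tripping it through the cloud.

def trigger_alarm():
    # Stands in for driving the actuator: siren, police call, etc.
    print("siren on, notifying monitoring service")

def handle_event_at_edge(event):
    # Evaluated locally, on the gateway or the device itself.
    if event["type"] in ("glass_break", "door_open") and event["armed"]:
        trigger_alarm()             # immediate local action, minimal latency
        return "handled_at_edge"
    return "forwarded_to_cloud"     # non-urgent events can still go upstream

result = handle_event_at_edge({"type": "glass_break", "armed": True})
print(result)                       # -> handled_at_edge
```

In a fog design, the same rule would live on a node on the LAN rather than on the gateway itself; either way, the decision never has to travel out to the Internet.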
In this lesson, we used a simple layering process to identify risk. We discussed security controls at the perception layer, communication layer, and application layer,
and lastly, we explored fog computing.