A while ago, I would only have known what the letters stand for, not what it actually is, and I don’t think I’m alone in that. However, the IoT is all around us and will continue to grow. I have been on a steep learning curve, and it made me wonder what we think we know and what the common misconceptions about IoT really are. Here is my top five list of what I believe are the main misconceptions, and I will try to debunk each of them (in a non-techie way!) for you.
Wearables such as wristbands, along with smart fridges and thermostats, are of course the well-known examples of IoT applications; ask most of the public for examples and these are what they would name. However, there are many more examples of IoT in the business environment beyond the obvious ones. Within urban environments, IoT can monitor air quality, provide adaptive traffic control and support environmental monitoring. On construction sites, IoT can monitor PPE usage and provide predictive maintenance. IoT can be found in every industry; for more information drop me a line and I will provide you with further examples.
The reason many solutions are only implemented by bigger businesses is that they pay large consultancies to demonstrate the economics. Find the right supplier and this should come as part of the package, making it much more viable for an SME to evaluate and implement.
There are many simple-to-install solutions with real business impact that any business can deploy to start its IoT journey, for example:
Security has to be one of the key points, and I would strongly advise anyone interested in IoT to talk with their security advisor about it. There is definitely no straightforward one-line answer; the reality is that almost any system has vulnerabilities, so it is about designing the solution so that those potential vulnerabilities become impotent.
For instance, it may be possible to intercept the data from a parking sensor; however, by ensuring there is no pathway from this device back into the cloud, network security is maintained and any intercepted data is entirely meaningless.
Security should be like an onion with many layers, each layer providing a different type and level of protection because hacking a system should not be as easy as knowing a password or IP address.
It is certainly possible to blow the budget on high-end systems. However, unless the system is being deployed for ecological benefit (which is priceless), the general purpose of using IoT in a business environment is to generate savings, gains or increased output, all of which produce a return on investment and so are not an additional cost burden to a business. With Capex and Opex options commonly available, cashflow doesn’t need to be negatively impacted either.
The technology adoption curve dictates that as IoT becomes mainstream, prices will decrease and utilisation will increase. A real-world example is the car industry, where Formula 1 drives innovation that filters down into the cars that you and I drive at an affordable price.
People have always feared technical change. The most obvious example is the Luddites, a secret oath-based organisation of textile workers in the 19th century, well known for destroying textile machinery in protests that stemmed from the fear of losing their jobs to machines. Jobs were lost, but others grew, and what was gained was a better quality of life, better health, and the creation of less harmful and dangerous jobs. There will be some job losses: Gartner predicts that 1.8 million jobs will be eliminated but, more positively, that 2.3 million jobs will be created with AI. New economies and new businesses will develop. Our lives will change, but we can’t replace humanity; for example, would you be happy with a robot cutting your hair?
This is just the tip of the iceberg regarding misconceptions and questions around the IoT. It’s interesting to look beneath the fears and drill down on the possibilities of IoT.
We enjoy questions and would be more than happy to answer any concerns or dispel any myths you may have heard around the IoT in a user-friendly, non-techie way. Please contact us if you have queries that are not covered in my top five and we will be happy to help.
2020 was a record-breaking year, but some of the reasons went unnoticed because of preoccupation with the pandemic. 2020 had the sunniest April and May on record and the third-hottest day ever recorded in August. It also had the wettest February on record with the highest number of flood warnings ever issued, as well as the wettest single day ever in October. It was warm, wet and windy.
The Met Office predicts ever hotter, drier summers and warmer, wetter winters. With sea levels also rising, we need proactive solutions quickly.
Surprisingly, the UK has no single body responsible for flood control. The Department for Environment, Food and Rural Affairs (DEFRA) is regarded as “the policy lead” in England. However, actual decisions depend on district and borough councils, the Cabinet Office, the Department for Communities, the Highways Authority, water and sewerage companies, independent Internal Drainage Boards, coastal protection agencies and a variety of other bodies. Even though water flows without regard to political boundaries, areas that fall inside Wales and Scotland have been devolved to their respective administrations, further impeding a unified response.
Although DEFRA specifically advocates a “free flow of information” between all these agencies, that flow is inevitably fragmented. Knowing what is happening at any given moment is further complicated by pumps and sluices under the control of landowners, and by construction activities that change the flow of surface and subterranean water. Widespread housing development on flood plains also changes the risks. What is missing is the ability to gather instant, highly detailed information on complete water systems.
Measuring water levels is not new: you can see old-style gauge boards beside many big rivers. DEFRA and other agencies rely on electronic measuring devices which communicate with remote measuring stations. They provide an API which allows any interested party to access their data (http://environment.data.gov.uk/flood-monitoring/doc/reference), but that data is limited.
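For the technically curious, here is a minimal sketch of pulling that data in Python using the requests library; the /id/floods endpoint, the min-severity parameter and the field names follow the reference page linked above, but treat them as illustrative and check the documentation before relying on them.

```python
import requests

BASE = "http://environment.data.gov.uk/flood-monitoring"

def current_flood_warnings(min_severity=3):
    """Fetch current flood alerts and warnings at or above the given severity
    (1 = severe flood warning ... 4 = warning no longer in force)."""
    response = requests.get(f"{BASE}/id/floods",
                            params={"min-severity": min_severity},
                            timeout=30)
    response.raise_for_status()
    return response.json().get("items", [])

if __name__ == "__main__":
    for warning in current_flood_warnings():
        # Each item carries a severity label and a human-readable area description.
        print(warning.get("severity"), "-", warning.get("description"))
```

Even a few lines like these show how open the data is; the real limitation is what the readings do and do not capture.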
The imminent water level in one location is often determined less by height readings at that location than by flow rates upstream and other factors. Water momentum, wind direction, the saturation of flood plains and the quantity of rain falling in any particular location are all hard to measure. Air humidity, temperature, soil condition, the water table, plant growth on riverbanks and debris in sewers all contribute to a complex picture.
Current systems don’t capture enough information, but they are also limited by how the data is used. Most water monitoring systems do little more than trigger an alert if the river level is high. This isn’t much better than a weather report telling you it is raining when you can look out of the window.
Alerts are often too inaccurate, imprecise or late for those affected to act. By using more sensors and better analysis we might achieve two things: earlier, more precise predictions and proactive flood prevention.
Some good news is the explosion in sensor devices that communicate via the Internet of Things. This means that more types of sensor can be deployed over larger areas cost-effectively. They require no independent wired or wireless networks and there is no reason for agencies to hold information silos. Local stakeholders can deploy and extract the information they require and easily share it with regional and national interests.
Many different kinds of sensor are available. For water level monitoring alone there are radar devices, electrical switches, gas bubbler pressure gauges, flowmeters and submersible pressure transducers. Especially interesting are ultrasonic devices such as the Wilsen sonic level [1].
IoT devices of this type use little power, so they can function for years unattended yet still connect with internet gateways up to 15 km away using networks such as LoRaWAN®. This means it is feasible to deploy them over larger areas, such as tributaries, drainage ditches, fields, forests and culverts.
The other potential advance depends on AI. Not only can an AI-powered system extract better flood predictions from larger data sources, but it could also be applied to prevent flooding. By truly centralising all water management related information, AI can identify the systemic patterns that lead to loss of control over water. Solutions that could minimise flood disasters may be deployed far from the locations they devastate, replacing flood alerts with avoidance policies.
AI and IoT approaches to flood prevention are already being explored by Fujitsu in Japan, Google in Patna, India, the Lassonde School of Engineering in Canada and the town of Cary in North Carolina, using the SAS Viya AI analytics platform and Microsoft Azure.
[1] https://www.pepperl-fuchs.com/great_britain/en/WILSEN-system.htm
Artificial intelligence is the beginning of a revolution, but in one way it is just like every other revolution: it can be abused. Whether or not you already use any AI, you need to understand two things: AI is cranking up the severity of security threats, but it can also offer improved security.
AI systems are fast and dynamic, meaning they learn from experience instead of relying on pre-programmed assumptions. AI-powered malware doesn’t require the hacker to know anything about you in advance. However, an AI-powered defence system needn’t depend on fixed definitions of who to trust and who not, or how they gain access. It can learn to recognise suspicious activity.
AI will power more advanced intrusion attempts into systems that are themselves more powerful. End users need to understand that the sophistication of AI-powered tools does not mean they are secure. For example, facial recognition systems powered by AI can potentially be spoofed by another AI, providing building access to criminals, or framing innocent people with forged video footage.
A report from Forrester, “Using AI for Evil”, says “mainstream AI-powered hacking is just a matter of time”, and Ciaran Martin of the National Cyber Security Centre said it is a matter of “when not if” [there will be a major attack on the UK].
Using “bot manipulation”, malware can use AI to adapt its appearance so that antivirus software doesn’t recognise it. It can also use AI to sample normal network activity and use it as camouflage, a technique known as “generative networks”. When the target is itself an AI system, a malicious actor can feed “poisoned” data to the engine in order to bypass filters or simply cause damage. AI can also learn to impersonate a legitimate person or company in order to launch a social engineering attack.
The ability of AI to react quickly and adjust its responses as situations evolve also makes it ideal for defenders. An AI security system gives defenders the edge by providing early warnings and rapid incident response, so attack vectors can be closed down before any real harm can be done. Darktrace is one such tool.
Behaviour analytics is another important defence tool. Detecting unusual activity allows the AI to close access to key resources while a deeper examination is undertaken, for example using Varonis. Mastercard’s director of cyber and intelligence solutions in South Africa says AI is saving $20 billion per annum by detecting fraud in this way. Embedded malware code can be detected using a similar method.
AI-powered solutions also help by improving activity logging: centralising it in a single place and providing tools to zoom in on significant trails. The logs collected by Azure and other Cloud platforms provide a good basis for an effective SIEM system. These tools also enable you to create and evaluate your alert-response workflows.
Once in the Cloud you have access to specialist security products and expertise that few enterprises can deliver in-house. Specialist companies constantly monitor the global situation to stay aware of threats emerging in particular sectors or locations. An ideal SIEM integrates this digital intelligence with your standard procedures such as logs, asset inventory, AI pattern detection and automated incident responses, and makes it easy to demonstrate your statutory compliance.
Unfortunately, we can’t wait for someone else to solve our cybercrime problems. The very people we should be able to trust to protect us, the NSA and GCHQ, created the EternalBlue tool used in recent ransomware attacks such as WannaCry, NotPetya and BadRabbit. They also left exploitable flaws in Windows and implanted backdoors into server and router firmware. For all the warnings against Huawei, the NSA has placed similar backdoor access into products from Cisco, Juniper and Fortinet.
The problem with creating these weapons is that everyone else soon uses them, and innocent companies are the victims. According to WikiLeaks on 7th March, the CIA regularly listens in on Samsung televisions and iPhones and can take control of numerous IoT devices and car computers. When they do it, others soon follow.
For businesses the goal is clear: keep spyware and vulnerabilities out of your software and hardware. That means taking a keen interest in where your IT products come from and investing in good security. There are limits to what is practical, but an integrated security system powered by AI is the best possible solution.
5G has suffered bad press from both detractors and supporters. Spoof stories about it spreading coronavirus were soon dismissed, but banal predictions of refrigerators ordering milk and shoppers wearing headsets to receive advertising were even more likely to blunt our interest. 5G undoubtedly creates the groundwork for an enormous technical revolution but adjusting the central heating with our smartphone or watching B-movies in higher resolution is not the point. Manufacturing and logistics industries will lead the real 5G revolution.
Although the public 5G network will take some time to get up to speed, local area networks can implement true 5G more quickly. This will enable factories, ports, universities, farms and airports to have their own industrial IoT systems (IIoT) today. Numerous factories are already claiming the ‘first’ 5G production lines, including a Nokia factory in Oulu, Finland, Worcester Bosch in the UK, Mercedes-Benz in Sindelfingen, Germany, and General Motors in Michigan.
Speed is often mentioned as a key advantage of 5G, but it helps if we break down the meaning of ‘speed’. 5G radio waves don’t move more quickly than 4G ones, rather the entire system has been optimised for faster data transfer. 5G can reduce latency to as little as a millisecond, enabling machinery to respond to sensors almost instantly.
Consider how quickly a driverless car must respond in order to operate safely and you will understand the value of low latency. In a similar way, 5G will enable a whole new generation of robots and automated machinery to radically improve dexterity, quality control and safety. Ericsson’s vice-president Åsa Tamsons explains:
"With one millisecond latency, you can sense whether there is a deviation in the process before the tool even hits the blade and you can stop the machine before the error happens".
‘Edge’ responses in today’s driverless cars are achieved by mounting the control device directly on the vehicle. 5G cars will achieve similar response times but with all the benefits of environmental network connectivity too.
5G also has far broader channels so that more devices can be connected simultaneously. It is said that 5G will soon be able to connect a million devices per square kilometre. Imagine what an engineer could do with ten thousand eyes and ten thousand hands. All the extra data feeding into AI enabled machinery would provide a precise real-time grasp of complex distributed systems and emergent situations with many industrial applications.
Not all 5G systems need to be this fast, but a typical industrial 5G LAN will match a good Ethernet one. A huge disadvantage of Ethernet is the wiring: it is expensive to install, prone to breakages and needs regular maintenance. In contrast, once set up, a wireless 5G system is easy to maintain and reliable (99.9999% or ‘six nines’ reliability).
One reason for hard-wiring a system rather than using ‘wi-fi’ is that most types of wireless connection can fail to penetrate walls and metal obstructions. However, 5G is relayed between multiple small nodes and can re-route itself instantly if a passing tanker or crane blocks any particular path between devices. The technology is called ‘coordinated multi-point’ (CoMP).
Finally, 5G provides much improved network control, including the ability to subdivide the network. Known as ‘network slicing’, this means each virtual sub-net can be customised and optimised for multiple different purposes.
Whether public or private, 5G networks have applications everywhere. By planting sensors in the ground, farmers will know precisely how much water or fertiliser their crops need and when, or can query weather satellites to predict their ideal harvest time and yield, and driverless machinery will often do the delivering. The health of herds can be monitored remotely and assets tracked across the farm and supply chains.
The IoT has already demonstrated multiple applications in health and fitness. We are beginning to use proximity sensors and temperature sensitive cameras to track disease outbreaks. In the future 5G may be able to stop a public health threat in its tracks. Augmented reality may also facilitate remote examinations, benefitting people in isolation and the NHS system.
5G supports three rather different kinds of technology; smartphone broadband, large-scale IoT and critical ‘edge’ operations. Because smartphone makers need to sell handsets to pay for the public network, some of the more frivolous ‘benefits’ have been hyped. Many people will receive a Samsung S20 this Christmas and wonder what to do with it. However, the real revolution will be quieter and more impressive: few enterprises will be able to ignore 5G and still remain competitive.
Everyone has heard of artificial intelligence and smart connected devices; they might be fun one day, along with Robocop and jetpacks. However, most people don’t realise they are already here in a big way, in fact they probably use them every day.
Of course, most of us have already encountered Alexa: kids love it and kids are never wrong, but as yet many owners have used it for little other than turning on the lights or doing a hands-free internet search. It’s still not quite as impressive as HAL from the movie 2001, even though it is almost 2021, but there is a lot more to AI and IoT than one might think. According to networking giant Cisco, the number of ‘things’ connecting to the internet overtook the number of people way back in 2008.
A mobile phone is the one connected device almost everyone carries around with them, but we still tend to think of them as “phones” rather than smart connected devices or artificial intelligence. In fact, most of them are both.
Many people will be surprised to know that more cars were added to mobile networks in 2016 than telephone handsets. Smart devices are not a novelty, they are already the norm, we have just not noticed.
Consider a stroll down your High Street. The many street cameras and other detectors you pass are probably feeding traffic information into an AI-powered management system. In some towns, they are managing the parking facilities, street lighting and bin collections. Almost every shop you pass has connected devices such as payment terminals, alarm systems, IP phones and CCTV. Some of these are linked to a facial recognition AI.
Larger businesses along with clinics, hospitals and banks depend on an AI to protect them from network intrusions. Many takeaways will be connected to smart-ordering networks and delivery tracking systems. Uber cars and haulage vehicles rely on a logistics AI and the day is fast coming when AI will be driving them too.
Let’s return to your smart phone. Your mobile carrier’s network depends on artificial intelligence to route tracking signals, calls, Wi-Fi connections and SMS messages. Your camera relies on AI to focus, detect edges and adjust the contrast. Many of the apps you use connect with AI to provide other services, for example to monitor your fitness or detect the presence of COVID-19 infections.
If you need help with any of these apps, your enquiry will probably be answered by an AI bot. When you connect to the Internet, AI chooses the ads you see, the search results you get, the movies shown on Netflix and the music promoted on Spotify. If you upload a photo to Twitter or Facebook, facial recognition AI will probably analyse it to see who else is in the picture.
Figures from 2017, compiled by Gartner, showed 8.4 billion devices connecting over the IoT. That’s more than all the people in the world. Deployment is happening so quickly that the number is now around 20 billion.
22% of IoT devices are inside factories; automating production lines, training robots, regulating conveying systems and ensuring quality control. Another 15% are specifically involved with energy efficiency management. Retailers currently account for just 12% of devices, for processes such as inventory tracking, footfall counters and security networks, while city management systems, such as traffic control, public transport and policing also use about 12%.
When device suppliers explain the potential benefits of the IoT they often use examples from our homes: central heating that knows how warm you like it, or ovens that switch on when you are on your way home. It is therefore surprising that building management still only accounts for about 3% of smart devices. There is still enormous potential for growth in this area, as well as for wearable devices in wristbands, spectacles, headbands and our clothing. The recent craze for ‘Pokémon Go’ demonstrated the enormous popularity and potential of augmented reality.
The smartphone isn’t just a connected device, it is the device most of us depend on to monitor smart devices elsewhere. 5G networks will soon lead to an explosion in consumer-friendly utilities based on AI and the IoT, so phone manufacturers are now beginning to use chips optimised for AI (“neural engines”). The only limit is our imagination.
Never before has it been possible to collect so much data. However, the data is worthless until you can mine it for information, which in turn is useless unless you can understand it. It’s disappointing that most companies are still reliant on two-dimensional charts, graphs and tables of impenetrable figures. The underlying data is labour-intensive to collect and processing it can take so long that actionable reports are often out of date. Finally, if you decide to respond in a particular way, you will have a further wait before you can evaluate the outcomes. Visual analytics combines the tools needed to perform all these steps, but on a much faster timescale.
Briefly, your information resources can be collected automatically by sensors and cameras or by querying a wide variety of company data resources. Once you have a single point of access, data mining or similar pre-programmed algorithms can rapidly extract and organise the data. You will then be able to extract meaningful correlations and aggregate key statistics. At the monitoring end, the salient information is presented in visual forms that human beings can understand at a glance and respond to swiftly.
Although they are complex, visual analytic systems are extremely flexible. If you can gather digital data on the activities or operations you need to monitor, you can apply visual analytic tools to them. This means they have a role to play in business, security, governance and on industrial production lines.
With many tools now available in the Cloud, visual analytics is within the reach of small and medium-sized businesses for the first time. Many systems are configurable using simple drag-and-drop interfaces, so although you need to understand your own operational requirements to design them, you don’t need skilled specialist IT teams to operate them for you.
If you can collect your data in real-time, such as from remote sensors and cameras linked across the IoT, then you can not only respond in real-time, but view the consequences of that response in real-time too. Many analytic suites also enable you to explore the consequences of a policy or production line change before you commit to it, as well as to identify historical trends.
Data science and statistical analysis isn’t something that everyone has time to understand, but pointing and clicking with a mouse is now commonplace. Visual analytic interfaces are designed for operational managers, to help them focus on their own areas of expertise and their own specific issues. Business managers and operational technicians can collaborate to devise solutions without needing to refer to IT specialists or external consultants.
Development is also simple. Sophisticated analytic systems can be built up without ever having to call in a coding team. Your solutions can be built in the Cloud, inside your intranet, or close to critical points in your operational infrastructure. The absence of a steep learning curve means there is a rapid return on investment for the company.
The flexibility of these systems enables, rather than replaces, human insight and experience. There are many areas where you can use these tools, guided only by your own creativity and imagination. In the course of exploring different data views you are very likely to discover answers to questions that you might never have thought to ask.
Before data analytics, if a report made you aware of a problem but didn’t explain the cause you would have to request more detailed information from front line departments and then wait for a further report. In contrast, visual analytics lets you explore a succession of views until you find the one that answers your question. You can dig deeper, or change the way you examine it with a single click until you find the view that makes the answer clear. As a result, your analyses are more thorough, more penetrating and, critically, up to date.
Analytic resources can be used to create a leaner, more agile enterprise by making your front-line teams and managers more self-sufficient. Visual systems can reveal exactly where your production line bottlenecks are happening, or predict where they are going to occur in the future. You can then make prompt adjustments to keep production flowing.
Access to a visual analytics dashboard can empower every member of your organisation by revealing exactly how their process is performing and whether it is keeping pace with other dependent processes. It can also be used to track management objectives, such as KPIs and other project landmarks.
Like sonar, ultrasonic transducers can detect detailed qualities of surfaces by bouncing sound waves off them. Today’s sensors are cheap, compact and have a vast range of applications.
The simplest ultrasound detectors in use are passive microphones, but the majority of applications use a transducer - a device that combines a sound emitter and an echo receiver. They are tuned to frequencies above 18 kHz and are therefore inaudible to the human ear. Sound can be generated in a variety of ways, but the most common methods are piezoelectric and capacitive (creating an electrostatic field between a back-plate and a diaphragm).
The vast majority operate in a straightforward way: they measure the time lapse between the signal generation and the arrival of its reflection. This simple principle can be tuned and adapted to achieve an impressive variety of functions. Sensors can detect and record speed, weather, size, material levels, numbers of items, condensation, contours and profiles, distance or proximity. Their targets can be large or tiny, near or distant, stationary or moving. When fitted in mobile vehicles and containers, geolocation reporting is often incorporated onto the same devices.
Some of the uses of proximity sensors are familiar - such as reversing vehicles and intruder alarms - but they have far more potential. Detectors can calculate accurate distances to the intercepted surface. The basic formula is simple: L = ½ × T × C, where L is the distance, T the time between transmission and reception, and C the speed of sound. Variations in the speed of sound can be allowed for as required, and many sensors can work underwater or within other fluids where sound propagates differently. Detected surfaces can be irregular or diffuse, such as wire mesh, yet the sensors are resistant to interference from mist or dust.
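As a worked illustration of that formula, here is a minimal sketch in Python, assuming a speed of sound of roughly 343 m/s in air at 20 °C (real devices compensate for temperature and medium):

```python
SPEED_OF_SOUND_AIR = 343.0  # metres per second in air at about 20 degrees C

def echo_distance(echo_time_s: float, speed_of_sound: float = SPEED_OF_SOUND_AIR) -> float:
    """Distance to the reflecting surface: L = 1/2 * T * C.

    The factor of a half accounts for the pulse travelling out and back."""
    return 0.5 * echo_time_s * speed_of_sound

# Example: an echo arriving 6 milliseconds after the pulse was emitted
# puts the surface roughly 1.03 metres away.
print(round(echo_distance(0.006), 2))
```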
This principle allows sensors to monitor the level of material in tanks and silos, which is highly valuable for any business reliant on continuous feedlines or just-in-time reordering. Ultrasound sensors can be set to detect a precise point or to average a broad, irregular field of objects such as those in municipal recycling bins. Despite irregular contents, the sensors can issue automated alerts to the collection or refilling company, ensuring logistical efficiency.
Sensors are also being deployed to monitor water levels in rivers and reservoirs, facilitating river management, water supplies, flood control and protecting natural habitats. Similarly, they can monitor levels when filling boxes and bottles or the cups in a drink dispenser.
Speed cameras use infra-red, but discrete ultrasonic transducers can apply the same principle to estimate the speed and direction of moving objects. There are a variety of ways to implement this: sensors can emit a series of pulses and calculate the change in distance over a given time, multiple sensors can triangulate in several dimensions, and (in principle) a Doppler effect could be calculated from pitch modulations.
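A hedged sketch of the first of those approaches, estimating speed from the change in distance between two successive pulses (with the same assumed speed of sound as the earlier example):

```python
SPEED_OF_SOUND_AIR = 343.0  # metres per second in air at about 20 degrees C

def approach_speed(echo_time_1_s: float, echo_time_2_s: float, pulse_interval_s: float) -> float:
    """Estimate how fast a surface is moving from two successive echo times taken
    pulse_interval_s seconds apart (positive = approaching, negative = receding)."""
    distance_1 = 0.5 * echo_time_1_s * SPEED_OF_SOUND_AIR
    distance_2 = 0.5 * echo_time_2_s * SPEED_OF_SOUND_AIR
    return (distance_1 - distance_2) / pulse_interval_s

# Example: echoes at 6 ms then 5.5 ms, taken 0.1 s apart,
# imply an approach speed of roughly 0.86 metres per second.
print(round(approach_speed(0.006, 0.0055, 0.1), 2))
```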
Engine and motor speeds can also be safely monitored using discrete acoustic sensors.
Counting is a basic requirement on many production lines, and acoustic sensors have no difficulty calculating the exact number of items that pass through the beam. Sensors can also sort boxes of different sizes, providing independent counts - useful for economic packing into delivery vehicles.
Some of the applications are surprising: for example, linking a sensor to a rotating anemometer and/or weather vane can provide constant information about wind speed and direction for meteorological purposes. Locating sensors at entry and exit points can ensure car park or building occupancy is not exceeded, or that all personnel have been evacuated.
Medical ultrasound scanners are familiar, but profile sensing is now highly versatile and affordable. It can be implemented into semi-automated or robotic assembly lines, or to support shaping and finishing processes. A simple, but vital, application is to monitor stacking. They can also accurately monitor roll diameters and coil winding and unwinding operations.
The possibilities for acoustic sensing are vastly extended by transmitting the output from smart sensors across the “internet of things”. Multiple outputs can then be combined in a single sophisticated web-based interface and monitored in real time from any convenient location. Smart sensors can then be easily interfaced with microcontrollers and actuators to provide sophisticated remote control and record details into enterprise management software.
Many businesses have yet to realise the wealth of opportunities that come from linking smart sensors to the IoT. Off-the-shelf devices are cheap and unobtrusive. For more demanding applications, bespoke algorithms can be implemented in the device firmware or in software at the collection point.
It is natural to find legionella bacteria in freshwater lakes and ponds. Some inevitably gets into domestic supplies and in very small quantities it is harmless. It is when it manages to breed and multiply inside our plumbing systems that it becomes dangerous. The lungs are particularly susceptible. Most people think of a pneumonia-like illness when you say legionella, but it can also cause a variety of problems including “Pontiac fever”, “Lochgoilhead” fever, septic shock and organ failure. Legionella is a killer.
Buildings locked up or powered down during the Covid-19 lockdown are now at increased risk, because legionella prefers stagnant water and lukewarm temperatures.
A legionella infection is more likely in a large building such as an office block, public space, large factory or retail premises, but that does not mean there is zero risk in smaller buildings. In fact, even small landlords are bound by law to protect their tenants.
Big or small, the landlords, employers and managers of public buildings are responsible under a range of legislation. The main three are the Health and Safety at Work Act 1974, the Management of Health and Safety at Work Regulations 1999, and the Control of Substances Hazardous to Health Regulations (COSHH). Legionella precautions are also required or implied by numerous codes of practice, including HSE Approved Code of Practice ACOP L8 and HSE HSG274, DoE Health Technical Memoranda HTM 04-01 and HTM 01-05 (healthcare and dentistry), and BS 7592 and BS 8580.
The greatest risks are usually associated with HVAC cooling towers, evaporative condensers, adiabatic coolers and all the associated pumps, tanks and pipework. However, premises of all sizes must comply with the same standards - including domestic landlords and letting agents. In small flats, cheap combi boilers that fail to deliver water at a stable hot temperature are a legionella hazard. High risks also come from stagnant water in shower outlets and clumsily capped pipework spurs.
The regulations apply to showers, vehicle washes, wet floor scrubbers, indoor water features, air washers, humidifiers, water softeners, chillers, spa pools, swimming pools, industrial quenchers, and to all hot and cold water supplies.
If a building occupant or visitor falls sick due to a legionella infection the consequences can be severe; evacuation, rehousing, lost income, new equipment, laboratory testing and compensation. Prevention really is better than the cure!
Begin by checking your basic system and identifying any problem points. It helps to have a schematic for a large building. In a small one, eliminate unused pipework and ensure the boiler/shower delivers a constant flow of hot water (without thermostats tripping on/off). The single most important factor in legionella prevention is to keep your cold water below 20 degrees centigrade, and your hot water close to 60 degrees centigrade or better.
You cannot guarantee compliance with those temperature requirements if you can’t monitor the temperatures throughout the system. To show compliance, you should document all your temperature readings and all your work on the system.
Regular readings and documentation could be a major headache; however, there is a smart solution. Fitting wireless temperature sensors on pipes and at other key locations enables you to collect temperatures almost constantly and log them automatically. Today there is a range of inexpensive sensors, many so unobtrusive you will hardly know they are there. Most can be fitted externally with negligible inconvenience. Some incorporate accelerometers that also monitor flow rates.
Smart sensors communicate using the IoT - the internet of things. You can then monitor what is happening via a Cloud dashboard from any computer or mobile phone at any time. If an issue is detected, the system will send you an alert by SMS or email so you don't have to sit there staring at it! Installing smart valves or smart thermostats into your plumbing system adds the ability to remote control the flow rates and temperature settings. In the long run, the opportunity to make water and fuel savings could pay for the system.
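To make the idea concrete, here is a minimal sketch (in Python) of the kind of temperature rule such a system applies, using the thresholds described above; the sensor identifiers, circuit labels and the hot-water cut-off chosen here are illustrative assumptions, not statutory values or any vendor's API:

```python
COLD_MAX_C = 20.0  # cold water should stay below 20 degrees centigrade
HOT_MIN_C = 55.0   # hot water should stay close to 60 degrees; 55 is used here as an alert margin

def check_reading(sensor_id: str, circuit: str, temperature_c: float):
    """Return an alert message if a logged temperature breaches its target range.

    `circuit` is 'cold' or 'hot'; the thresholds above are illustrative, not statutory."""
    if circuit == "cold" and temperature_c >= COLD_MAX_C:
        return f"{sensor_id}: cold water at {temperature_c:.1f} C - in the legionella growth range"
    if circuit == "hot" and temperature_c <= HOT_MIN_C:
        return f"{sensor_id}: hot water only {temperature_c:.1f} C - below target"
    return None

# Example: a lukewarm cold-water riser triggers an alert, which the monitoring
# platform could then forward by SMS or email as described above.
print(check_reading("riser-3", "cold", 24.5))
```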
The L8 Approved Code of Practice suggests the entire water system should be reviewed every 1-2 years, but sooner when there are relevant changes. Those changes include plumbing work (because dirt can enter the pipework), periods of vacancy (because of stagnation), or temperature fluctuations (tripping thermostats). The presence of anyone at increased risk (with kidney disease, immune system impairment, the elderly and so forth) also constitutes a change meriting more frequent monitoring. If you install a smart connected system, inspections and reviews require very little work at all.
The last decade has seen huge advances in artificial intelligence, smart devices and video analytics. The next will see a dramatic increase in the devices built from them. In fact, demand will be so high that we need to start thinking about our capacity to deliver them.
One bottleneck is the networks over which we expect them to connect. As 5G rolls out, 4G is still patchy outside urban areas and the capacity of our networks to carry 5G traffic has been questioned. Its rollout was also somewhat muted by attacks on phone masts by protesters.
Data centres are also feeling the strain. As more companies, individuals and devices link to Cloud services, data centres have to increase capacity, but noise abatement and heat dissipation make expanding or finding new sites a challenge.
The irony is that only a few emerging technologies need an explosively growing network; demand seems to be driven by people rather than machines. Follow any link to a 4G or 5G website and you quickly discover the benefit of being able to download a 2hr movie in 10 seconds. A strange boast considering that almost everyone now streams, not downloads, movies (and we can’t help wondering why they need them on the move).
By comparison, a smart meter reports your gas and electricity usage about six times per day, taking about 3 seconds in total. Smart meters also use some data to maintain their network connection, but that only raises their usage to about a minute a day.
Only a few devices need to transmit more than a few kilobytes of information per hour, nothing comparable to a movie download. Visual feeds from cameras are heavier on bandwidth, but how many hours of CCTV footage of empty buildings do we really need to collect on central servers?
The IoT is an outstanding medium for data gathering and remote control; the Cloud is ideal for data storage and leasing advanced applications; but the most exciting frontier is the development of autonomous systems. When we can store sophisticated algorithms on a chip, smart devices are not only less dependent on human management but also less dependent on networks. Problems such as communication interruption, bandwidth overload and response latency begin to disappear.
The obvious example is the self-driving car. Not only is it heavily dependent on advanced image recognition, it must perform that recognition at blistering speed. If self-driving cars had to depend on a remote server for their analytics, they could never match the response times of human drivers. There are several other reasons for providing them with a connection (traffic information, for example), but the visual analytics that enable them to drive have to be local.
Video feeds are also a heavy load on human observers. CCTV security systems will be more effective when the equipment itself can identify salient events. In fact, the raison d'être for driverless cars is to improve on the situational awareness and sluggish responses of tired human drivers.
Cloud (or other network) dependence is the weak link in many IoT deployments, impairing their speed and reliability. The alternative is to distribute the processing workload close to the edge of the network, near the device. This is often called “Edge computing”.
Rapid situational awareness can often be achieved by incorporating AI or video recognition algorithms onto the device itself, or supplying them in a specialised processing unit in close proximity. This infrastructure can still work in symbiosis with distant resources and control systems, but the bulk of the processing is shifted as close as possible to where it is immediately needed.
In the next few years, real-time information response capabilities will find a multitude of new niches and transform existing ones. For example, video surveillance has been booming for years (in retailing, transport and security systems), but re-establishing those systems on edge architectures will transform their value by making the intelligence they collect actionable.
Knowing which bus ran you over might be useful in an inquest, but we would rather be warned that the bus is coming. Or consider the difference between scouring a police officer’s bodycam footage to see who fired at them and a system that can recognise a gun and issue a warning that saves their life.
Ideal solutions will often be hybrid. Many systems can learn to recognise faces locally, for example at ATMs and robotic checkouts, yet they can still liaise with central repositories when needed.
Fully autonomous robots are no longer far-fetched, but in the meantime let Net 4 show you how to future-proof your video processing systems.
Environmental Protection UK recently declared that “Air pollution has a devastating impact on the UK population, shortening lives, causing early deaths and ill health. It is a bigger global killer than smoking. It costs the UK economy over £20 billion a year.” (https://www.environmental-protection.org.uk/policy-areas/air-quality/air-pollution-law-and-policy/air-pollution-laws/)
Common pollutants include ozone, sulphur oxides, nitrogen oxides, dioxins, polycyclic aromatics, carbon monoxide, particulates, ammonia, methane, hydrogen sulphide, chlorine, hydrogen chloride, hydrogen cyanide, phosphine and ethylene oxide. The consequences range from debilitating fatigue, headaches, hay fever, skin disorders and eye irritation up to fatal illnesses including lung cancer, emphysema, asthma, COPD, bronchiolitis and cardiovascular diseases. Polluted air has also been linked to mental illness, behavioural disorders, impaired development and miscarriage.
The EU accused the UK of failing to comply with EU air quality regulations in 2017 (http://europa.eu/rapid/press-release_IP-17-238_en.htm) and the UK government declared air pollution a national health emergency the following year (https://publications.parliament.uk/pa/cm201719/cmselect/cmenvfru/433/433.pdf). A post-Brexit Environment Act is in the pipeline but it is unclear whether it will have any more effect than previous failed legislation.
Outdoors, traffic fumes overtook factory chimneys as the leading problem long ago, despite which many proposals for the new Environment Act still focus on “factory emissions” as did the last Clean Air Act in 1993. The proposals also largely ignore indoor sources of air pollution.
According to a study by the US EPA, indoor air quality is often five times worse than the air outside (https://www.epa.gov/report-environment/indoor-air-quality). The main offenders are synthetic materials such as composite wood furnishings (which leach formaldehyde), synthetic carpeting, cosmetics, pesticides, office printers, photocopiers, asbestos-laden roof tiles, faulty aircon systems, domestic cleaning products, and (ironically) “air fresheners”.
Whether the new legislation addresses these problems or not, it is clear that employers, industrial facilities, office managers, local government and the public at large should be looking for solutions. We all breathe the same air and it is a significant hidden burden on our communities and productivity.
Most cities and towns have a few air monitoring stations, but their coverage is poor. They are also fixed in location (often the wrong ones) and the public have little access to their readings. As a result, they leave us with a lot of guesswork.
Most people assume that pollutants disperse as you get further from the source. That is not always the case: many roll into low-lying areas or form invisible clouds overhead that descend to ground level when the air temperature changes (for example at dusk). Air quality in specific areas can be substantially worse than thinly sprinkled monitors reveal. In short, to understand our air pollution problems and correct them, we need more monitors. That is equally true inside our homes and places of work, and outside in our city streets and countryside.
Before the IoT, better monitoring was impractical, but a wide range of air-quality sensors are now available. The IoT makes it easy to collect and monitor their data from almost anywhere. Detectors in fixed locations help us understand how conditions change over time, but we can also use mobile detectors to greatly extend our geographical coverage. If every council vehicle carried a monitor, blackspots would be discovered quickly and dealt with.
High-end devices are capable of establishing the parts per billion of a wide range of pollutants. Others focus on particular known hazards, such as nitrogen oxides or aromatics. At the cheaper end of the range, suitable for many domestic and industrial uses, sensors can provide a simple “red-amber-green” warning about airborne particulates. They are increasingly popular with urban cyclists, alerting them when to don a face mask.
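As an illustration of the “red-amber-green” idea, here is a minimal Python sketch that maps a PM2.5 particulate reading to a band; the breakpoints are purely illustrative assumptions, not the thresholds of any particular device or standard:

```python
def particulate_band(pm25_ug_m3: float) -> str:
    """Map a PM2.5 concentration (micrograms per cubic metre) to a traffic-light band.

    The breakpoints below are illustrative only; real devices define their own."""
    if pm25_ug_m3 < 12:
        return "green"  # little cause for concern
    if pm25_ug_m3 < 35:
        return "amber"  # sensitive groups may want to limit exposure
    return "red"        # time to consider a face mask or a different route

# Example: a roadside reading of 41 micrograms per cubic metre maps to 'red'.
print(particulate_band(41))
```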
The most common method for connecting an air quality sensor is a simple 3G or 4G SIM card. However, there are many systems for collecting transmissions. In some parts of the British Isles, notably Scotland so far, LoRa wireless networks are available. No matter how much sophistication you require in your sensors, the vast majority of systems report to a Cloud service where the data is accessible through a simple website interface.
Understanding the data you collect is made easy by proven off-the-peg tools such as the Tableau analytics platform. Visualisation tools make it easy to understand the results of your monitoring devices at a glance. If necessary, you can then cross-reference your readings with factors such as weather information.
The BreatheLife2030 organisation has declared 7th September 2020 the first “International Day of Clean Air for Blue Skies”. There is no better time to be looking at IoT air quality tools than today.