Article 1

The Nuclear Option

Stuart Sharrock

I used to be a nuclear physicist; still am presumably. I don’t know how you can stop being a nuclear physicist. Certainly you can stop being a practising nuclear physicist, as I did nearly half a century ago, but you can’t escape from the approach to problems and the outlook on life that a scientific training implants. Nuclear physics in particular instils an analytical methodology tempered by the realisation that the world of quantum mechanics is counterintuitive – you shouldn’t assume anything in a probabilistic world where entities can be in multiple places simultaneously.

 


Article 2

Key Considerations in the Development of an IoT Architectural Framework

Oleg Logvinov

The ongoing convergence of operations technology (OT) and information technology (IT) is playing a key role in driving IoT adoption across a wide range of industries. Information technology has been applied increasingly in operations over the past thirty years – first in operational equipment, and more recently to integrate operational departments such as manufacturing, order entry, accounts receivable and payable, general ledger, purchasing, warehousing, transportation and human resources. Still, that process faces its own issues. As the analyst firm Gartner observes, "The relationship between IT and OT groups needs to be managed better, but more importantly, the nature of the OT systems is changing, so that the underlying technology – such as platforms, software, security and communications – is becoming more like IT systems."

 


Article 3

Connected versus Intelligent Devices in the IoT – and in Saunas

Aapo Markkanen

One of the most profound questions affecting the Internet of Things at the moment is where the smarts in smart systems will reside. The first phase of the IoT – an extension from its terminological precursor, M2M – has been based on the premise that the device itself is rudimentary and any intelligence in it comes from the cloud level. Moreover, in many cases “intelligence” has not been a priority to begin with, and the application has been developed to deliver only remote control or servicing initiated by a human operator, without any real need to capture and process data.

 


Article 4

IoT: Future-proofing Device Communications

Scott Lofgren

The Internet of Things (IoT) is a real game-changer and has the potential to transform and improve our lives, bringing with it the combination of connected devices and intelligent data. The industry is already making strides towards increased interconnectivity, with research and analysis firm IDC estimating that IoT spending will grow to $8.9 trillion by 2020. IDC also expects the installed base of the IoT to grow to 212 billion "things" globally by the end of 2020, including 30.1 billion installed "connected (autonomous) things" driven by smart systems installed across both consumer and enterprise applications to collect data.

 

 

This Month's Contributors

Stuart Sharrock has been working as an analyst and consultant in the telecommunications industry for the past three decades.

Oleg Logvinov is the Director of Special Assignments in STMicroelectronics’ Industrial & Power Conversion Division.

Principal analyst Aapo Markkanen leads ABI Research's Internet of Everything Research Service, contributing to various research activities related to Internet of Things, M2M, and big data.

Scott Lofgren serves as the UPnP Forum President, as well as participating in many of the working committees.

 

Contributions Welcomed

 

Would you like more information? Have any questions? Please contact:

Raffaele Giaffreda, Editor-in-Chief
raffaele.giaffreda@create-net.org

Stuart Sharrock, Managing Editor
stuartsharrock@ieee.org

 

About the IoT eNewsletter

The IEEE Internet of Things (IoT) eNewsletter is a bi-monthly online publication that features practical and timely technical information and forward-looking commentary on IoT developments and deployments around the world. Designed to bring clarity to global IoT-related activities and developments and foster greater understanding and collaboration between diverse stakeholders, the IEEE IoT eNewsletter provides a broad view by bringing together diverse experts, thought leaders, and decision-makers to exchange information and discuss IoT-related issues.

IoT: Future-proofing Device Communications

Scott Lofgren
March 10, 2015

 

The Internet of Things (IoT) is a real game-changer and has the potential to transform and improve our lives, bringing with it the combination of connected devices and intelligent data. The industry is already making strides towards increased interconnectivity, with research and analysis firm IDC estimating that IoT spending will grow to $8.9 trillion by 2020[1]. IDC also expects the installed base of the IoT to grow to 212 billion "things" globally by the end of 2020, including 30.1 billion installed "connected (autonomous) things" driven by smart systems installed across both consumer and enterprise applications to collect data.

The rapidly increasing number of devices – in the home and workplace – and the growing availability of low-power interconnectivity are undoubtedly factors driving the IoT revolution. Coupled with the demand for multi-vendor interoperability, ease of use and self-installation, the IoT presents a tremendous opportunity for innovation and growth by bringing together people, processes, data and devices.

Creating harmonization

As exciting as the overall opportunity for the IoT market is, harmonizing the growing number of vertical segments is paramount. Owing to the previous absence of standardization and with convenience and cost in mind, many IoT projects were built vertically. Devices and connectivity were provided by a single vendor, with little or no consideration for interoperability with products from other vendors, leading to fragmentation of the market. The lack of compatibility hobbled network flexibility and functionality, ultimately limiting consumer choice in the rapidly emerging market.

All of the greatest IoT opportunities – from the connected home, out-and-about mobile interactions, smart meters, the connected car, and smart grid to personal wellness and connected health – have been driven from a vertical market perspective. However, the Internet of Things requires interoperability over the internet regardless of verticals. Nobody wants different solutions for their smart home, mobile phone and connected car. By definition all the ‘things’ should be internetworked.

Furthermore, users expect fully secure and private access to data and applications anywhere, anytime and on any platform. They increasingly expect to read, analyse and control every network or device from every other network or device. Only the individual user can determine which software versions and operating systems are used. Without multi-vendor interoperability, things might simply come to a grinding halt and the user experience will be, to say the least, unsatisfactory.

To become truly successful, the IoT requires an open means for allowing devices to find one another and communicate. Data has to be able to flow freely between countless applications and platforms onto any device. In telecom, we’re witnessing a vast uptake of mobile devices, Web 2.0 software and social networking. For office networks, it’s the uptake of portable computing, apps and the introduction of new – mobile and offsite – ways of working.

The role of UPnP

To address the vertical challenge, manufacturers need to agree on a limited number of open standards. With billions of devices already deployed and open source implementations on virtually every operating system and in every programming language, UPnP technology is already one of these key standards. The UPnP standard is vendor-neutral and already provides the foundation to complement a variety of management gateways and device control scenarios, incorporating well-vetted mechanisms for security, discovery and service advertisement. UPnP core technology provides a base for IoT, creating bridges to both wide-area networks and non-IP devices. With the recent introduction of UPnP+ by the UPnP Forum, the technology is ready to lead the way for the IoT.
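
As a concrete illustration of the discovery and advertisement mechanisms mentioned above, the following minimal sketch sends an SSDP M-SEARCH request (the multicast discovery step that UPnP control points use to find devices on the local network) and prints the description URLs that respond. It is a simplified example of the underlying discovery protocol only, not of the UPnP+ additions; a real control point would go on to retrieve and parse each device description.

```python
# Minimal sketch of UPnP's SSDP discovery step: broadcast one M-SEARCH
# request and print whichever devices respond. Illustrative only.
import socket

SSDP_ADDR = ("239.255.255.250", 1900)   # standard SSDP multicast group and port
MSEARCH = "\r\n".join([
    "M-SEARCH * HTTP/1.1",
    "HOST: 239.255.255.250:1900",
    'MAN: "ssdp:discover"',
    "MX: 2",                             # devices may delay responses up to 2 s
    "ST: ssdp:all",                      # search target: all devices and services
    "", "",
]).encode("ascii")

def discover(timeout: float = 3.0) -> list[str]:
    """Send one M-SEARCH and collect the LOCATION header of each reply."""
    locations = []
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(MSEARCH, SSDP_ADDR)
        try:
            while True:
                data, _addr = sock.recvfrom(65507)
                for line in data.decode("ascii", errors="replace").splitlines():
                    if line.lower().startswith("location:"):
                        locations.append(line.split(":", 1)[1].strip())
        except socket.timeout:
            pass                          # quiet network, or all replies received
    return locations

if __name__ == "__main__":
    for url in discover():
        print(url)                        # each URL points to a device description XML
```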

UPnP+ is an evolution of previous UPnP capabilities that will assist devices attempting to integrate new paradigms, such as mobile connected computing, cloud-based service delivery, smartphone content sharing, and the IoT. It provides an improved, seamless experience for the consumer and creates new values and opportunities for manufacturers, developers and integrators. It leverages existing UPnP protocols and takes them into the cloud to bridge the IoT, while continuing to support legacy UPnP devices.

It supports the implementation of web browser controls for a wide range of functions, ensuring future connectivity and making new services possible in areas such as health and fitness, energy management, sensor management, security and sustainability. UPnP+ incorporates IPv6 and has capabilities for discovering cloud services and new grouping/pairing capabilities. It includes a host of enhancements and delivers an improved baseline for interoperability.

The next-generation UPnP+ standard goes beyond consumer media devices and is focused on delivering new technical capabilities to enhance product functionality and provide a more sophisticated, intuitive and deeper user experience across platforms. It addresses tomorrow’s connectivity requirements, removing boundaries and enabling full device and network compatibility for new and exciting experiences.

Achieving total interconnectivity

The pace of development in IoT is astonishing and represents a huge opportunity for the industry as a whole. However, the lack of standards and interoperability between vendors, combined with the proliferation of innovative services, leaves many users flipping between a multitude of applications, or frustrated at the lack of support on their platforms.

As more and more connected devices join the IoT ecosystem, the industry needs to focus on providing safe, reliable interoperable access to services and information regardless of the vertical segment or vendor. Inter-device standardization is a vital requirement and with UPnP+, total interconnectivity and limitless functionality can be achieved, enabling the Internet of Things to reach its potential.

 

[1] IDC October 2013, http://www.businesswire.com/news/home/20131003005687/en/Internet-Poised-Change-IDC#.VNiUdZ2sUjo

 


 

Scott Lofgren serves as the UPnP Forum President, as well as participating in many of the working committees. He is a 29-year Intel veteran, reporting to the Intel NTG CTO office. He has held other industry consortia positions, including DLNA Board of Directors Alternate and Advisory Council Chair, OIPF Board of Directors, and Founder and Chairman of the EoU PC Quality Roundtable.

 

 

Connected versus Intelligent Devices in the IoT – and in Saunas

Aapo Markkanen
March 10, 2015

 

One of the most profound questions affecting the Internet of Things at the moment is where the smarts in smart systems will reside. The first phase of the IoT – an extension from its terminological precursor, M2M – has been based on the premise that the device itself is rudimentary and any intelligence in it comes from the cloud level. Moreover, in many cases “intelligence” has not been a priority to begin with, and the application has been developed to deliver only remote control or servicing initiated by a human operator, without any real need to capture and process data.

We at ABI Research refer to this approach as the connected device paradigm. For players involved in M2M/IoT it has been more of a necessity than a choice: the connectivity problem has simply been solved before the computing problem, so it made sense to design the early systems in this way.

Advances in computing and power consumption, however, are beginning to pull intelligence from the cloud to the network edge. The increase in so-called edge intelligence, in turn, is making the available architecture choices more nuanced and allowing organizations deploying the IoT to enhance their physical assets and processes in novel ways. As a result, the industry is now on the verge of the intelligent device paradigm, which can be summarized as bringing three advantages:

Latency: Processing data at the edge reduces latency from the actions. Use cases that are highly time-sensitive and require immediate analysis of, or response to, the collected sensor data are, in general, unfeasible under cloud-centric IoT architectures, especially if the data are sent over long distances.

Security: By and large, sensitive and business-critical operational data are safer when encrypted adequately at the edge. Unintelligent devices transmitting frequent and badly secured payloads to the cloud are more vulnerable to hacking and interception. Additionally, enterprises may need to secure and control their machine data on the pre-cloud level for compliance reasons.

Total Cost of Ownership: The paradigm shift can reduce the IoT systems’ total cost of ownership, or TCO. Intelligent devices are more expensive upfront than less sophisticated alternatives, but their TCO over a long service life can prove substantially lower, owing to reduced data service costs and extended battery life.
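
To make these trade-offs concrete, here is a minimal, hypothetical sketch of the intelligent device paradigm at work in a gateway: time-critical readings are acted on locally instead of waiting for a cloud round trip (latency), and only periodic summaries are sent upstream (data costs and battery life). The class names, thresholds and message formats are invented for illustration and do not describe any particular product or platform.

```python
# Hypothetical edge-intelligence sketch: react locally to urgent readings,
# send the cloud only compact summaries of everything else.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Reading:
    sensor_id: str
    value: float      # e.g., temperature in degrees C

class EdgeGateway:
    def __init__(self, alarm_threshold: float, batch_size: int):
        self.alarm_threshold = alarm_threshold
        self.batch_size = batch_size
        self.buffer: list[float] = []

    def on_reading(self, r: Reading) -> None:
        # Time-critical path: decide locally, no cloud round trip.
        if r.value > self.alarm_threshold:
            self.actuate_locally(r)
        # Non-critical path: buffer and summarize before uplink.
        self.buffer.append(r.value)
        if len(self.buffer) >= self.batch_size:
            self.send_to_cloud({"avg": mean(self.buffer),
                                "max": max(self.buffer),
                                "n": len(self.buffer)})
            self.buffer.clear()

    def actuate_locally(self, r: Reading) -> None:
        print(f"ALARM: {r.sensor_id}={r.value}, shutting valve locally")

    def send_to_cloud(self, summary: dict) -> None:
        print(f"uplink summary: {summary}")   # one small message instead of many

# Example: 100 raw readings become only a handful of uplink messages.
gw = EdgeGateway(alarm_threshold=80.0, batch_size=25)
for i in range(100):
    gw.on_reading(Reading("boiler-1", 20.0 + (i % 90)))
```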

Edge intelligence can take place on the endpoint device as well as on routers, switches, and other gateway/hub devices. The latter has the benefits of a less constrained form factor and, often, access to the power grid, so the chances are that for the time being most IoT practitioners will find it the more compelling of the two.

The gateway model, in which the gateway device communicates with endpoints over a short-range wireless technology, appears to be gaining traction in both industrial and consumer-facing settings. The ongoing innovation in mesh networking is likely to give a further boost to gateway-driven intelligence over the next couple of years, by opening up more extensive short-range designs.

At the same time, it is important to understand that the cloud level, as such, will by no means be going away. Rather, it is likely that the IoT will reshape the cloud as a concept, and make it something more distributed than it has been during the “digital-first” internet era, with its vast and centralized data centers. For the physical-first Things, the cloud may well turn out to be a network of local (city-level) or even hyper-local (neighborhood-level) layers that deals with the data as close to the source as viable.

Ultimately, no single IoT architecture can fully address all possible use cases. The key learning here should be that organizations betting on the IoT are finally starting to have real technology choices at their disposal, and those choices should depend on the characteristics of the use case – where and when the data need to be processed, what the security requirements are, and what the estimated TCO looks like. For some use cases, the best set-up is still more about being merely connected, whereas for some others there’s a case for investing in smartening Things up and making them truly intelligent.

Sauna stoves

To demonstrate how the difference between “connected” and “intelligent” device paradigms plays out in real life, let me cite a carefully selected, if potentially unorthodox, example from the consumer market and tell you a little about sauna stoves. On a global level, the sauna stove is an admittedly niche product category, yet in my native Finland it represents a market with a remarkably high installed base. According to the most reliable estimates, there are slightly over two million saunas in the country, against a total of 2.5 million households. Each sauna has one stove, which is heated by wood or electricity.

In recent years, the Finnish sauna industry – starting from its high end – has been experiencing an increasing uptake of connected sauna stoves that can be turned on and off remotely, usually from a mobile application. In this paradigm, the sauna itself remains fairly dumb and unaware of its physical context, so while the connectivity element provides a nice dose of everyday convenience for the end user, it also poses a serious risk to life and property if used carelessly.

See, many sauna users tend to use the stove not only for bathing but also for drying sports gear and other personal accessories on the rocks that cover the stove’s fireplace or electric heating elements. Naturally, this secondary use case is safe only when the stove is turned off: if the rocks are holding anything flammable when the stove is on, the drying process turns into an immediate fire hazard. This makes the primary use case problematic when cloud-enabled remote control is added to the equation, since as a user interface it is inherently riskier than the traditional one, which requires the user to initiate the heating with a match or a switch in the same physical space as the device.

According to the national safety authority, fire incidents caused by saunas are on the rise in Finland (from 52 in 2010 to 156 in 2013), after declining for years, and based on its study the trend can be largely attributed to the growing popularity of remote control. This adds an exotic flavor to the matter of “IoT security”, and goes to show rather tangibly how equipping a physical-first device with connectivity, but with no intelligence, can have very negative consequences.

In this example, the solution to the problem is easy to see. Ideally, a sauna stove would have a set of sensors that prevent remote control if they detect anything on the rocks that does not belong there. The data readings would be processed either by the stove itself (i.e., the endpoint) or by a smart-home hub (i.e., a gateway), and besides the improved product security, the paradigm shift could add further value. Examples of such next-generation features that spring first to my mind include adjusting the heating cycle (to factor in a prolonged evening run, interworking with a wearable activity tracker) or the temperature (to factor in different preferences amongst the bathers, interworking with a smart showerhead).
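
Purely as a thought experiment, an interlock of the kind described above might look like the following sketch, in which a remote start request is honoured only if the stove's (hypothetical) sensors report nothing resting on the rocks. Every class, sensor and threshold here is invented for illustration; no real stove exposes such an API.

```python
# Hypothetical safety interlock for a connected sauna stove: the stove (or a
# smart-home hub acting as gateway) refuses a remote "heat" command unless
# its sensors report that the rocks are clear.
from dataclasses import dataclass

@dataclass
class StoveSensors:
    load_on_rocks_grams: float     # weight sensor under the rock tray (invented)
    cover_detected: bool           # e.g., a drying rack left above the stove

class SaunaStove:
    MAX_FOREIGN_LOAD_G = 50.0      # tolerance for measurement noise (invented)

    def __init__(self, sensors: StoveSensors):
        self.sensors = sensors
        self.heating = False

    def remote_heat_request(self) -> bool:
        """Return True if heating was started, False if the interlock refused."""
        if self.sensors.cover_detected:
            return self._refuse("cover or rack detected above the stove")
        if self.sensors.load_on_rocks_grams > self.MAX_FOREIGN_LOAD_G:
            return self._refuse("unexpected load on the rocks")
        self.heating = True
        return True

    def _refuse(self, reason: str) -> bool:
        print(f"remote start blocked: {reason}")
        return False

# A towel left on the rocks keeps the remote command from doing harm.
stove = SaunaStove(StoveSensors(load_on_rocks_grams=400.0, cover_detected=False))
assert stove.remote_heat_request() is False
```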

Sadly, though, even the intelligent and learning sauna does have one fundamental downside. It is not compatible with wood-burning stoves, which in terms of heat quality are superior to electrically heated ones. The ideal user interface does not always equal the ideal user experience.

 


 

Principal analyst Aapo Markkanen leads ABI Research's Internet of Everything Research Service, contributing to various research activities related to Internet of Things, M2M, and big data. In his research, he explores areas such as predictive analytics, product lifecycle, quantified self, contextual awareness, cloud platforms, and IoT developers. Before joining ABI Research, Aapo worked as an analyst at IHS, where he was responsible for providing market intelligence and strategic analysis on the European telecoms sector and its leading players. He holds BSc and MSc degrees in management studies from the University of Tampere, Finland.

 

 

Key Considerations in the Development of an IoT Architectural Framework

Oleg Logvinov
March 10, 2015

 

The ongoing convergence of operations technology (OT) and information technology (IT) is playing a key role in driving IoT adoption across a wide range of industries. Information technology has been applied increasingly in operations over the past thirty years – first in operational equipment, and more recently to integrate operational departments such as manufacturing, order entry, accounts receivable and payable, general ledger, purchasing, warehousing, transportation and human resources. Still, that process faces its own issues. As the analyst firm Gartner observes, "The relationship between IT and OT groups needs to be managed better, but more importantly, the nature of the OT systems is changing, so that the underlying technology – such as platforms, software, security and communications – is becoming more like IT systems."[1]

This OT/IT convergence is taking place across a number of vertical industries, even as they continue to independently develop devices, systems, and applications to leverage the benefits of IoT in their respective fields. At the same time, a trend toward more unified platforms is emerging, revealing the beauty of the IoT: joining previously independent vertical segments to produce a broader, standardized, multi-domain platform – one that promises to streamline and even incentivize IoT development and implementation.

Numerous examples prove the point. eHealth was originally set up to serve the medical field, and has quickly become a platform with powerful crossover applicability, easily adapted for Smart Me and Smart Home applications. Smart Cities can now support electric vehicles being charged by power grids distributed throughout the city. These scenarios demonstrate the horizontal nature of IoT solutions, which deliver compelling benefits to multiple verticals while enriching those benefits through informational cross-pollination. Smart cities, homes, and workplaces; e-health; resilient, self-healing power grids; digital factories; cleaner transportation; immersive entertainment – these are just a few areas of economic opportunity that stand to benefit from the increased interoperability and portability that a standardized IoT architecture brings. This explains the general consensus that strengthening the horizontal nature of the IoT is worth pursuing. So, what key considerations should we weigh?

IEEE P2413

The IEEE standard in development, P2413 – Standard for an Architectural Framework for the Internet of Things – builds on the horizontal value of the IoT by recognizing the evolving transformational integration and convergence across technology and application domains as the starting point for an extensible integrated architectural framework. The P2413 Working Group has identified that doing this effectively requires a blueprint for data abstraction and the quality of "quadruple trust" that encompasses protection, security, privacy, and safety.

Power consumption due to communication is rapidly moving into the spotlight: IoT devices in many cases have limited power resources and need to conserve energy to prolong battery life or minimize power-supply requirements. Recently a new trend, hybrid networking, has emerged. IEEE 1905.1 was developed for the gaming and infotainment sector to allow devices to independently determine and select the best network protocol and media for data transmission. It is only natural to imagine the next step: including power-consumption metrics that would enable choosing the best network protocol for data transmission under the varying power-consumption scenarios associated with specific application requirements.

The reality is that as the IoT develops – and produces a significant increase in data flows among the devices and subsequently an increase in power consumption due to transmission of application-related data – having devices and applications capable of hybrid network access will be key to maximizing energy conservation. What’s more, the protocols themselves could also be adapted for lower power consumption so that current ones, such as Wi-Fi, which are designed for live, “continuous” communication to maintain network connectivity, can be enhanced to consume much less power when they are simply operating to avoid being “dropped” from the network.

Greater application awareness is another related concern demanding attention. As sensors and monitoring devices become more capable of handling multiple monitoring functions, identifying and prioritizing the different data elements being gathered and analyzed will allow data to be transmitted at the lowest possible power consumption, and with the necessary quality of service. Fostering increased application awareness fits well within the goals of IEEE P2413, where it can play a major role by enabling cross-domain interaction and platform unification through increased system compatibility, interoperability, and functional exchangeability.
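
To illustrate how power-aware hybrid networking and application awareness could combine in practice, the sketch below chooses, per message, the lowest-energy interface that still meets the application's declared latency requirement. This is an assumption-laden illustration only: the interface names, energy costs and latency figures are invented, and the selection logic is not taken from IEEE 1905.1 or IEEE P2413.

```python
# Illustrative sketch: pick an uplink per message by combining hypothetical
# per-interface power costs with the application's latency requirement.
from dataclasses import dataclass

@dataclass
class Interface:
    name: str
    mj_per_kb: float        # energy to transmit 1 kB (made-up numbers)
    latency_ms: float       # typical one-way latency (made-up numbers)

@dataclass
class Message:
    size_kb: float
    max_latency_ms: float   # application-declared requirement (QoS)

INTERFACES = [
    Interface("802.15.4 mesh", mj_per_kb=0.3, latency_ms=400.0),
    Interface("Wi-Fi",         mj_per_kb=2.0, latency_ms=20.0),
    Interface("Cellular",      mj_per_kb=6.0, latency_ms=80.0),
]

def pick_interface(msg: Message) -> Interface:
    """Cheapest interface (in energy) that still meets the message's latency need."""
    eligible = [i for i in INTERFACES if i.latency_ms <= msg.max_latency_ms]
    if not eligible:
        eligible = INTERFACES            # fall back to best effort
    return min(eligible, key=lambda i: i.mj_per_kb * msg.size_kb)

# A routine meter reading tolerates delay and goes over the low-power mesh;
# an alarm needs low latency and is worth the extra energy of Wi-Fi.
print(pick_interface(Message(size_kb=1.0, max_latency_ms=5000.0)).name)  # 802.15.4 mesh
print(pick_interface(Message(size_kb=0.2, max_latency_ms=50.0)).name)    # Wi-Fi
```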

Overarching architectural framework

Today, there are many groups working independently to develop IoT standards, each demonstrating great examples of innovation. Many of these initiatives are, however, somewhat specialized. IEEE P2413 defines an overarching architectural framework to promote cross-domain interaction, with the goal of aiding system interoperability and functional compatibility and further fuelling the growth of the IoT market. The architecture and reference model for IEEE P2413, designed to reflect the convergence of IT and OT as well as the horizontal nature of the IoT, has been created from the outset to provide a simplified path forwards to universal standards for the IoT as a whole.

There is sound reasoning behind the development of IEEE P2413. When looking to past standards initiatives, it’s clear that what’s needed to create viable global standards for the IoT is a conscious and continual collaboration between all parties involved. In effect, IEEE P2413 can become a great meeting place for many industry groups to join forces and work together in order to build ecosystems that successfully and effectively leverage the power of all “things” and the body of work that already exists today.

We are at an evolutionary crossroads in shaping and driving the IoT forward, and the speed with which we are moving toward an increasingly connected world, together with the benefits it can bring, makes this a historically exciting time. As the IoT moves into its next stage, it’s clear that next-generation applications will require a data abstraction blueprint and a set of basic building blocks from which multi-tiered systems can easily be created. By providing these common elements, IEEE P2413 will help minimize industry and vertical market fragmentation, ease implementation of cross-domain applications, and ensure the IoT achieves its full potential on a global scale.

 

[1] http://www.gartner.com/newsroom/id/1590814

 


 

Oleg Logvinov is the Director of Special Assignments in STMicroelectronics’ Industrial & Power Conversion Division. Mr. Logvinov is also chair of the IEEE P2413 "Standard for an Architectural Framework for the Internet of Things" Working Group. He currently serves on the IEEE Standards Association (IEEE-SA) Corporate Advisory Group and the IEEE-SA Standards Board. He helped found the HomePlug Powerline Alliance and is the past President and CTO of the Alliance. During the last 25 years Mr. Logvinov has held senior technical and executive management positions in the telecommunications and semiconductor industry. Mr. Logvinov has nineteen patents to his credit.

 

 

The Nuclear Option

Stuart Sharrock
March 10, 2015

 

I used to be a nuclear physicist; still am presumably. I don’t know how you can stop being a nuclear physicist. Certainly you can stop being a practising nuclear physicist, as I did nearly half a century ago, but you can’t escape from the approach to problems and the outlook on life that a scientific training implants. Nuclear physics in particular instils an analytical methodology tempered by the realisation that the world of quantum mechanics is counterintuitive – you shouldn’t assume anything in a probabilistic world where entities can be in multiple places simultaneously.

But for me nuclear physics brought more. I was not only fortunate enough to be taught by some of the leading pioneers in the field but also grew up within an academic community of physicists who had been involved with the Manhattan Project, the research and development project that produced the first atom bombs during World War II. That exposed me to the consequences that can result when the relationship between the scientific community and the world of national and international politics comes under stress.

Many scientists involved with the Manhattan Project expressed their fears and uncertainties about the effects of atomic warfare long before the United States dropped the first bomb on Hiroshima. They even argued that control of nuclear energy should be out of the hands of the state and made strenuous attempts to prevent atomic warfare from ever taking place. They not only failed but many were subsequently persecuted for their stance.

The experience was traumatic. Many of the scientists felt betrayed, some had nervous breakdowns, a few turned to espionage, and a number committed suicide. They had made monumental advances in understanding and created phenomenal capabilities but had lost all control over the use that could be made of their discoveries.

One positive consequence of that experience was that the research community resolved to create an international laboratory to bring together scientists from different nations to collaborate on further research in nuclear physics. The result was CERN, a European laboratory that brought together research groups from East and West – at the height of the Cold War!

When scientists understand the potential threats posed by their discoveries and policy makers do not, there is a dangerous disconnect – joint ventures such as CERN go some way towards countering such dangers. They foster mutual understanding between scientists from different nations. But there is still a need for dialogue between scientists and policy makers.

Net neutrality

Another dangerous disconnect is apparent today. The principle that internet service providers and governments should treat all data on the internet equally – the so-called net neutrality debate – has been nothing if not controversial. An astonishing 3.7 million comments concerning the issue have been filed with the FCC. But do politicians and policy makers really understand these network issues? Arguments presented within the net neutrality debate indicate clearly that they do not. And this is hardly surprising; members of Congress with a background in or knowledge of science are exceedingly rare animals – some observers claim the species is already extinct. Engineers, who have the most to say about how to manage networks, long ago left the debate, leaving networks to be designed by lawyers.

The internet was planned to be anonymous. Communications engineers understand the potential threats to privacy and security posed by an internet that was designed deliberately without an identity layer. Politicians and policy makers are unconcerned by such technical detail. They have a different perspective and a very different agenda. As a result their relationship with the communications community is becoming increasingly stressful.

Does this matter? I think it does. The need for an open interchange of ideas between the communications industry and policy makers has perhaps never been greater. Recent disclosures about the activities of government security agencies and massively resourced cybersecurity attacks illustrate the urgent need for such a dialogue.

Is the communications industry facing its own Manhattan Project moment? Perhaps not now but it could be just around the corner.

The IoT

The World Wide Web was invented at CERN. It launched the internet from being an academic research network into a global public resource. The subsequent impact of the internet has already been phenomenal but is now set to reach unprecedented levels as the Internet of Things adds new dimensions of interconnectivity.

Citizens in the UK have been told not to worry. The very vastness of the data collections concerned imposes a comforting level of protection. No-one, say the politicians, could possibly ever have the time to trawl through all that lot.

That's not reassuring at all. People will not do the trawling, machines will. And the machines will analyse the data using increasingly complex algorithms designed by other machines deploying increasingly sophisticated artificial intelligence. What rules such systems will obey and what actions they will trigger should surely be a matter for public debate.

And the emphasis should be on debate. The potential benefits of the IoT are so enormous that people waving warning flags can be accused of scaremongering, of raising the spectre of an Orwellian society in an attempt to stifle the adoption of IoT solutions. This is not what we want. What we are highlighting is the need to educate people about responsible risk assessment. If we fail to remain alert to the dangers of unethical behaviour then we sink to the level of certain paranoid psychopaths working in the investment banking industry – and just look at what happened there.

The ability to observe, capture and store almost every interaction between humans and devices has significant consequences for privacy and security. The ability and intent to extract correlations by mining vast quantities of data has significant implications for society and justice. But correlations do not demonstrate causality and a propensity to commit an offence is not a crime. Or is it? The internet lacks an identity layer; the IoT lacks an ethical layer.

I would argue that these issues deserve serious consideration. It would be irresponsible to enter the realms of complex algorithm-driven business models devised by increasingly sophisticated artificial intelligence and delivered by robots without being clear about what society regards as acceptable and what it does not. Remember: machines have no ethics.

 


 

Stuart Sharrock has been working as an analyst and consultant in the telecommunications industry for the past three decades. He holds a BSc in Natural Philosophy from Edinburgh University and a PhD in Nuclear Physics from University College London and has conducted research in nuclear physics in the Soviet Union, Switzerland, the UK and the USA. He lectured in physics at UCL before entering the commercial world as the Physical Sciences Editor of Nature, the world's leading peer-reviewed scientific journal, and subsequently as a publisher of scientific books and journals for a number of major publishing houses, before becoming an independent analyst and consultant. Stuart is a member of the IEEE and the Managing Editor of the IEEE IoT eNewsletter.