May 17, 2020

Why the time is ripe for wearables in the enterprise

B2B
CIO
IoT
wearables
Neil Bramley
4 min
Wearable devices

The idea of integrating wearable technology into enterprise IT infrastructure is one which, while mooted for several years now, has yet to take off in earnest. The reasons behind previous false dawns vary. What is evident, however, is that even where wearables to date have lacked the mobility or security capabilities to fully support the ways in which we now work, organisations remain keen to unlock the potential of such devices. According to ABI Research, global wearable device shipments will reach 154 million by 2021, a significant jump from approximately 34 million in 2016.

Ripening conditions for wearable growth

This projected increase demonstrates a confidence amongst CIOs which perhaps belies the lack of success in the market to date, but at the same time it reflects a ripening of conditions which could make 2018 the year in which wearables finally take off in the enterprise. A maturing IoT market, advances in Augmented Reality (AR) and the impending arrival of 5G, which is estimated to reach a subscription base of half a billion by 2022, are all factors which will drive the capabilities of wearable devices.

Perhaps the most significant catalyst behind wearables is the rise of Edge Computing. As the IoT market continues to thrive, IT managers must be able to handle the vast amounts of data it generates securely and efficiently. Edge Computing helps organisations resolve this challenge, while enabling new methods of gathering, analysing and redistributing data and the intelligence derived from it. Processing data at the edge reduces strain on the cloud, as users can be more selective about the data they send to the network core. Such an approach also makes it easier for cyber-attacks to be identified at an early stage and contained on a device at the edge, and data can be scanned and encrypted before it is sent to the core.
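As a rough illustration of that edge-side pattern, the sketch below shows the kind of logic an edge gateway might run: aggregate readings locally, apply a basic scan for anomalous values, and encrypt the summary before it travels to the core. It is a minimal sketch only; the sensor data, the anomaly threshold and the send_to_core transport call are all hypothetical.

```python
import json
import statistics
from cryptography.fernet import Fernet  # third-party 'cryptography' package

# Hypothetical batch of readings gathered at an edge gateway.
readings = [{"sensor": f"s{i}", "temp_c": 20.0 + (i % 5)} for i in range(100)]

def summarise(batch):
    # Aggregate locally so only derived intelligence travels to the network core.
    temps = [r["temp_c"] for r in batch]
    return {"count": len(batch), "mean_temp_c": statistics.mean(temps), "max_temp_c": max(temps)}

def looks_anomalous(batch, limit=80.0):
    # Crude local "scan": flag the batch if any reading is implausibly high,
    # so a faulty or compromised device can be contained at the edge.
    return any(r["temp_c"] > limit for r in batch)

key = Fernet.generate_key()  # in practice the key would be provisioned securely, not generated here
cipher = Fernet(key)

if not looks_anomalous(readings):
    payload = json.dumps(summarise(readings)).encode()
    ciphertext = cipher.encrypt(payload)  # encrypted before it leaves the edge
    # send_to_core(ciphertext)            # hypothetical transport call to the network core
```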

As more and more wearable devices and applications are developed with business efficiency and enablement in mind, Edge Computing’s role will become increasingly valuable – helping organisations to achieve $2 trillion in extra benefits over the next five years, according to Equinix and IDC research.

Where will wearables have an impact?

At the same time as these technological developments are aiding the rise of wearables, CIOs across various sectors are recognising how best to use these devices to enhance mobile productivity within their organisations, another factor which is helping to solidify the market. In particular, it is industries with a heavy reliance on frontline and field workers, such as logistics, manufacturing, warehousing and healthcare, which are adopting solutions like AR smart glasses. The use case for each is specific to the sector, or even to the organisation itself, but this flexibility is often what makes such devices so appealing. Whereas wearables for the more traditional office worker may offer a different but no more efficient way to carry out everyday tasks such as checking emails and answering phone calls, for frontline and field workers they can be tailored to meet unique demands and enhance the ability to perform specific tasks.

Take, for example, boiler engineers conducting an annual service: AR smart glasses could overlay the boiler's schematics to give a hands-free view of service procedures, and when a fault becomes a barrier to repair, the engineer can use collaboration software to call on a remote expert for assistance. Elsewhere, in the healthcare sector, smart eyewear may give clinicians hands-free access to patient records, medical procedures and information on medicines and results.

Such examples demonstrate the immediate and diverse potential of wearables across different verticals. With enterprise IT infrastructure now in a position to embrace such technologies, it is this ability to deliver bespoke functionality to mobile workers which will be the catalyst for continued uptake throughout 2018 and beyond.

Neil Bramley, B2B Client Solutions Business Unit Director, Toshiba Northern Europe


Jun 11, 2021

Google AI Designs Next-Gen Chips In Under 6 Hours

Google
AI
Manufacturing
semiconductor
3 min
Google AI’s deep reinforcement learning algorithms can optimise chip floor plans dramatically faster than their human counterparts

In a paper published in Nature on Wednesday, Google announced that its AI can design chips in under six hours; humans currently take months to design and lay out the intricate chip wiring. Although the tech giant has been working on the technology quietly for years, this is the first time that AI-optimised chips have hit the mainstream, and the first time the company will sell the result as a commercial product.

“Our method has been used in production to design the next generation of Google TPU (tensor processing unit) chips”, wrote the paper’s authors, Azalia Mirhoseini and Anna Goldie. The TPU v4 chips are the fastest Google system ever launched. “If you’re trying to train a large AI/ML system, and you’re using Google’s TensorFlow, this will be a big deal”, said Jack Gold, President and Principal Analyst at J.Gold Associates.
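To show why that matters in practice for TensorFlow users, the sketch below uses the standard public TensorFlow 2.x pattern for running a Keras model on TPU hardware. It is generic illustrative code, not Google’s chip-design system, and the TPU connection details (the tpu="" argument here) depend on your environment.

```python
import tensorflow as tf

# Connect to a TPU cluster; tpu="" works on a Cloud TPU VM, while other
# environments need the address of their TPU (environment-specific).
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# Anything built inside the strategy scope is replicated across the TPU cores.
strategy = tf.distribute.TPUStrategy(resolver)
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )

# model.fit(train_dataset, epochs=3)  # unchanged training code, now running on TPUs
```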

 

Training the Algorithm 

Using a process called reinforcement learning, Google engineers trained the AI on a set of 10,000 chip floor plans. Each example was assigned a reward score based on metrics such as efficiency and power usage, which the algorithm then used to distinguish between “good” and “bad” layouts. The more layouts it examined, the better it became at generating layouts of its own.
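Google’s production system uses a deep policy network trained on real designs; purely to illustrate the idea of scoring candidate layouts and preferring higher-reward ones, here is a toy sketch with hypothetical block names, connectivity and a crude wirelength-based reward.

```python
import random

# Toy illustration only: NOT Google's method. It shows the idea of assigning
# each candidate layout a reward and keeping the highest-scoring one seen so far.

GRID = 100.0
BLOCKS = ["cpu", "cache", "dram_ctrl", "io", "accel"]            # hypothetical blocks
NETS = [("cpu", "cache"), ("cpu", "accel"),                      # hypothetical wiring
        ("cache", "dram_ctrl"), ("io", "cpu")]

def random_layout():
    # Place each block at a random (x, y) position on the die.
    return {b: (random.uniform(0, GRID), random.uniform(0, GRID)) for b in BLOCKS}

def reward(layout):
    # Shorter total wiring between connected blocks is better, so the reward
    # is the negative of the summed Manhattan distances.
    wirelength = sum(abs(layout[a][0] - layout[b][0]) + abs(layout[a][1] - layout[b][1])
                     for a, b in NETS)
    return -wirelength

best = random_layout()
for _ in range(10_000):
    candidate = random_layout()
    if reward(candidate) > reward(best):
        best = candidate

print(f"best reward found: {reward(best):.1f}")
```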

 

Designing floor plans, or the optimal layouts for a chip’s sub-systems, takes intense human effort. Yet floorplanning is similar to an elaborate game: it has rules, patterns and logic. In fact, just like chess or Go, it is an ideal task for machine learning. Machines, after all, don’t follow the same constraints or in-built conditions that humans do; they follow logic, not preconceptions of what a chip should look like. And this has allowed AI to optimise the latest chips in a way we never could.

As a result, AI-generated layouts look quite different to what a human would design. Instead of being neat and ordered, they appear somewhat haphazard: blurred photos of the carefully guarded chip designs show a noticeably more chaotic wiring layout, but no one is questioning its efficiency. In fact, Google is starting to evaluate how it could use AI in architecture exploration and other cognitively demanding tasks.

Major Implications for the Semiconductor Sector 

Part of what’s impressive about Google’s breakthrough is that it could throw Moore’s Law, the axiom that the number of transistors on a chip doubles roughly every two years, out the window. The physical difficulty of squeezing more CPUs, GPUs and memory onto a tiny silicon die will still exist, but AI optimisation may help speed up chip performance.
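For a sense of what that doubling rate implies, the short calculation below projects transistor counts over a decade from a purely hypothetical 10-billion-transistor starting point (illustrative figures, not real product specifications).

```python
# Illustration of Moore's Law: transistor counts doubling roughly every two years.
base_year = 2021
base_transistors = 10e9   # hypothetical starting point, not a real chip specification

for year in range(base_year, base_year + 11, 2):
    doublings = (year - base_year) // 2
    projected = base_transistors * 2 ** doublings
    print(f"{year}: ~{projected / 1e9:.0f} billion transistors")
```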

 

Any chance that AI can help speed up current chip production is welcome news. Though the U.S. Senate recently passed a US$52bn bill to supercharge domestic semiconductor supply chains, the country’s largest tech firms remain far behind. According to Holger Mueller, principal analyst at Constellation Research, “the faster and cheaper AI will win in business and government, including with the military”.

All in all, AI chip optimisation could allow Google to pull ahead of competitors such as AWS and Microsoft. And if we can speed up workflows, design better chips and free up humans to solve more complex, fluid, wicked problems, that’s a win for the tech world and for society.
