Infrastructure efficiency

SMEDI’s civil engineering BIM simplifies project complexity

15 November 2015

3 min read

The Shanghai Municipal Engineering Design Institute (SMEDI), one of China’s top municipal engineering companies, has completed 12,000 projects including water treatment plants, as well as road, bridge, rail, urban landscape, fuel gas and geotechnical engineering projects. Compass spoke with Lv Wei Zhang, associate chief engineer in SMEDI’s IT Center, and Junwei Wu, deputy director of SMEDI’s BIM Center, about their work to develop IT solutions for civil engineering’s unique challenges.

COMPASS: What challenges are SMEDI facing in executing its work?

LV WEI ZHANG: In China, it is common for major infrastructure projects to be carried out with design and construction happening in parallel. Typically, only 50% of the project is designed when construction begins. During construction, owners can plan the rest of the project with greater precision, so they modify the design as the project evolves. Adjusting the design mid-construction is simply part of how these projects run.

This process is convenient for owners but very difficult for designers. To succeed, we must be able to clearly visualize the outcome of our design to ensure both quality and efficiency. With an advanced Building Information Modeling (BIM) platform, we can improve communication with owners, make design changes with great flexibility, manage project status with precision and efficiency, and recover from project delays effectively.

Before employing the advanced BIM platform, what difficulties did you encounter in your work?

LWZ: In the past, our contractors used the in-situ casting method quite extensively, with a lot of casting work happening at the site. This had numerous drawbacks. First, it was difficult to control material waste. Second, it was hard to manage cost. Third, managing time and schedule was a big challenge. Finally, casting on-site occupied much more space than prefabrication would require, so other contractors were often blocked from their work sites for prolonged periods.

How did you solve this challenge?

LWZ: First of all, we fully integrated our work into an engineering procurement construction (EPC) system that provides an overview of engineering, procurement and construction and how they relate to one another. We did off-site prefabrication as much as possible, and we launched a BIM system, which significantly enhances overall efficiency.

What are the benefits of BIM?

LWZ: First, BIM enables us to achieve collaborative 3D design. The designs, from the macro system down to the micro parts, are displayed as 3D visuals, giving clarity and precision in communication with all stakeholders. Second, BIM facilitates data exchange in an industry-standardized format, so that everyone sees the same information clearly.

Could you briefly describe your application of BIM?

JUNWEI WU: The civil engineering industry in China has been shifting from CAD to BIM since 2005, and we adopted BIM in our design work at that time. Before that, we had to endure the shortcomings of 2D design: modifications were not linked together. In other words, changing one drawing did not automatically trigger changes in the other drawings. With BIM, a change to one area alerts the designer to any related areas that need to change as well.
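The linked-change behavior described above amounts to a dependency graph: each model element knows which other elements depend on it, and a modification flags them for review. A minimal illustrative sketch of that idea (hypothetical class and method names, not any vendor’s API):

```python
class Element:
    """A model element in a parametric, BIM-style dependency graph."""

    def __init__(self, name):
        self.name = name
        self.dependents = []     # elements that must be reviewed if this one changes
        self.needs_review = False

    def add_dependent(self, other):
        self.dependents.append(other)

    def modify(self):
        # A change here flags every linked element for the designer's attention,
        # unlike 2D drawings, where each sheet had to be updated by hand.
        for dep in self.dependents:
            dep.needs_review = True


beam = Element("beam")
slab = Element("slab")
column = Element("column")
beam.add_dependent(slab)
beam.add_dependent(column)

beam.modify()
print([e.name for e in (slab, column) if e.needs_review])  # ['slab', 'column']
```

Real BIM platforms go much further (propagating geometry, not just flags), but the core contrast with unlinked 2D drawings is this notification step.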

BIM was first used in our water treatment plants, but ordinary BIM does not always have adequate capability to handle roads and bridges. We worked with our supplier to develop a BIM solution tailored to civil engineering that excels at visualizing roads, bridges and tunnels. It demonstrates our design concepts and presents even minor features with precision.

Could you give some examples?

JW: SMEDI is particularly strong in designing bridges. For instance, the Ganjiang River Second Bridge, in Jiangxi Province, has a “fish-like” design that merges very well with the landscape. The structure is complicated, with the steel above, concrete below, and a mixture of both in the middle. We used our specialized civil engineering BIM, enabling well-planned division of work, with different engineers deployed collaboratively for components, the skeleton and the steel structure.

With our civil engineering BIM, it has become much easier for us to accommodate changes in design, which can be frequent. In the past, making changes to the design often took even longer than the designing itself. Now, the pain of endless modifications is significantly reduced.

Another notable example is the Yanggao South Road Tunnel project in Shanghai, a project involving many tunnels and bridges. Our BIM made design much more precise and easier to visualize.

What is your overall evaluation of the civil engineering BIM solution that your partner developed, based on your input?

JW: We have benefited immensely from this platform. Our partner has long been number one in the field of manufacturing, and we foresee that “manufacturing today is the civil works of tomorrow.”

Furthermore, the platform has saved a significant amount of our time. Before this platform, we spent about one-third of our time doing design work and two-thirds of our time doing communication. Apart from facilitating our design work, BIM makes communication much faster and easier, and this translates into substantial cost savings.

Our BIM platform is specially designed to effectively solve civil design industry challenges. We believe this platform is simultaneously mature and innovative.◆

Wireless power for the IoT

Growth of Internet-connected devices spurs need for new ways of generating power

Lindsay James

6 min read

Powering the increasing number of Internet-connected devices is no easy feat, especially given the expense, inconvenience and, in some cases, infeasibility of wiring remote devices or fitting batteries. New ways of energy harvesting promise to solve the challenges of powering the Internet of Things.

Imagine being able to power a device with a grape. It may sound absurd but, according to John Blyler, an adjunct assistant professor of systems engineering at Portland State University in Portland, Oregon, it is already possible.

“Today, a few tiny grapes can drive ultra-low-power electronics,” Blyler said. “A few years ago, global semiconductor company Texas Instruments demonstrated a grape-powered digital clock. The 16-bit microprocessor in the clock required so little energy that it could run off of a few grapes. The minuscule amount of acidity in the grape, when teamed with a zinc metal contact, created a battery to power the clock.”

The same pH-based chemical reactions that allow several grapes to power a digital clock can also be used to enable crops in the field to power a small radio frequency (RF) circuit that can send a report of soil conditions wirelessly to a farmer’s computer. “This is just the start of how energy harvesting can be used to provide alternative power sources,” Blyler said.
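Back-of-the-envelope arithmetic shows why “a few” grapes are needed rather than one. The figures below are typical classroom values for zinc/copper fruit cells, not numbers from the article: roughly 0.9 V per cell, against the roughly 1.5 V supply a low-power LCD clock expects.

```python
import math

# Assumed figures (illustrative, not from the Texas Instruments demo):
volts_per_cell = 0.9   # approximate open-circuit voltage of one zinc/copper fruit cell
clock_voltage = 1.5    # typical supply voltage for a low-power LCD clock module

# Cells wired in series add their voltages, so round up to the next whole cell.
cells_needed = math.ceil(clock_voltage / volts_per_cell)
print(cells_needed)  # 2 -- which is why a couple of grapes in series suffice
```

The current such a cell can deliver is tiny (microamps), which is why only ultra-low-power electronics like a clock’s 16-bit microprocessor can run on it.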


Finding alternative sources of power for difficult-to-wire devices has become a priority for researchers, largely because of the rise of the Internet of Things (IoT), also known as the Internet of Everything. According to Cisco, a global leader in networking technology, the world will have 10 times as many Internet-connected devices as connected people by 2020, with more than 5 billion connected people and 50 billion connected things.


“As we move toward a world filled with a large number of IoT devices distributed all around us, the means to power these devices is becoming a huge problem,” said Vamsi Talla, a doctoral candidate in electrical engineering at the University of Washington in Seattle. “Imagine a scenario where a room is filled with hundreds, if not thousands, of sensors. These sensors are great because they can facilitate a multitude of applications – enabling a smart home or a smart city, monitoring the temperature of soil in a farmer’s field, tracking the flow, level and viscosity of goods in a manufacturing plant – there are unlimited possibilities.”

However, Talla said, providing power to these sensors is a challenge. “Wires are impractical and batteries require constant replacement and add cost, size and weight. This is a big stumbling block. So, if we can eliminate batteries and instead power devices using harvested energy, IoT devices will achieve mass adoption and deployment and, ultimately, fulfill the IoT dream.”

50 billion

By 2020, Cisco predicts the world will have more than 50 billion connected things.

To date, self-powered devices rely on three main sources of energy. “These are kinetic energy – whereby lateral movement, rotation or vibration can generate electrical energy using electromagnetic or piezoelectric harvesters,” said Matthias Kassner, a product marketing director at EnOcean, a company located near Munich that develops patented, self-powered wireless technology. “Thermal energy – close-distance temperature differences – can be converted to electrical energy. (The third is) ambient energy sources such as light, electromagnetic waves, as well as chemical and bioelectric systems.”


Researchers have made significant advances in all three areas. In the field of kinetic energy, for example, Southampton, UK-based Perpetuum, a leader in vibration energy harvesting, has developed the technology to convert vibrations into electrical energy that can be used to perpetually power autonomous, maintenance-free industrial wireless sensor nodes. These sensor nodes are being used on trains to monitor the condition of their bearings – a task previously done manually.

“Today there are fleets of trains across the world using Perpetuum’s vibration energy-harvesting-powered sensors,” said Roy Freeland, the company’s president, who is chairman of Innovate UK’s energy harvesting steering committee and a member of the European Union’s ZEROPOWER Scientific Advisory Committee.



“I can pull out my smartphone while I’m on holiday and examine, in real time, the condition of the bearings on the 10:37 London to Brighton train. It’s a great example of a high-volume IoT application of energy harvesting.”

Vibration sensors also are being widely used in process manufacturing plants. “Gas, chemical and power stations, including GE’s Bently Nevada, (plus energy equipment companies) Emerson and Honeywell, are using vibration energy harvesters to power wireless sensing systems in their process plants,” Freeland said. “This means that central operations have complete visibility over the status of their equipment, wherever it is in the world.”

Meanwhile, GreenPeak Technologies, a Utrecht, Netherlands-based provider of RF communication technology for wireless connected home applications, has developed a light switch that can operate without batteries. “Just flicking the switch generates enough energy to transmit a signal to a lamp,” said Cees Links, founder and CEO.


In the field of thermal energy, EnOcean has created a technology that can power sensor nodes using small fluctuations in temperature. This principle is already being used by farmers to collect in-field data, including temperature, humidity, soil moisture, pH levels and macronutrients.

Energy can be harvested from almost anything, including the acidity in tomatoes. These devices from EnOcean could power moisture or temperature sensors in remote fields and transmit the data to the farmer’s computer – or to a “smart” irrigation system. (Image © EnOcean)

“Our technology has led to the emergence of self-powered heating valves, which can automatically regulate the heating based on room temperature and occupancy,” Kassner said. “This enables a reduction of heating cost – usually the biggest share of energy cost in private homes – by around 20%-30%, without any user intervention.”

Advances in ambient energy harvesting include the ability to broadcast power to remote devices with Wi-Fi signals, using a modified router that broadcasts noise and maintains constant energy to provide power. “Recently, our group has shown that you can power devices from Wi-Fi access points to a distance of up to 30 feet (9 meters),” Talla said. “For example, we have recently demonstrated the ability to power a surveillance camera using the energy harvested in this way.”
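A free-space (Friis) estimate makes the scale of RF harvesting concrete. The figures below are illustrative assumptions, not the University of Washington group’s actual setup: a 2.4 GHz router transmitting 100 mW (20 dBm) through unity-gain antennas at 9 meters.

```python
import math

# Assumed, illustrative parameters:
P_tx = 0.1        # transmit power in watts (100 mW, a typical Wi-Fi router)
freq = 2.4e9      # Wi-Fi carrier frequency, Hz
c = 3.0e8         # speed of light, m/s
d = 9.0           # distance in meters (about 30 feet)

wavelength = c / freq  # ~0.125 m at 2.4 GHz

# Friis transmission equation with unity antenna gains:
# P_rx = P_tx * (wavelength / (4 * pi * d))^2
P_rx = P_tx * (wavelength / (4 * math.pi * d)) ** 2

print(f"{P_rx * 1e6:.2f} microwatts")  # on the order of a tenth of a microwatt
```

A tenth of a microwatt is enough only for duty-cycled, ultra-low-power circuits, which is exactly the limitation Talla, Kassner and Jayakumar describe: harvested power arrives in trickles or bursts, not at battery-like levels.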

Solar energy is another form of ambient power. Fujitsu in Kawasaki, Japan, for example, has developed a solar-powered beacon that, at just 2.5 millimeters (0.09 inches) thick, can be attached to curved surfaces, corners – even clothing. Meanwhile, two engineers in the electrical engineering department at the Massachusetts Institute of Technology (MIT) in Cambridge, Massachusetts, have developed a small-scale converter chip that can convert up to 80% of solar energy into electricity. Dina El-Damak, a doctoral student, and her professor, Anantha Chandrakasan, point out that this is a substantial improvement over traditional solar cells, which can convert at most half of the solar energy they collect into usable electricity.


While the potential of energy harvesting is promising, experts caution that it’s important to remain realistic about the extent to which these solutions can be applied in everyday life.

“The biggest challenge with energy harvesting techniques is that they provide extremely small amounts of energy,” said Hrishikesh Jayakumar, a doctoral student at Purdue University in West Lafayette, Indiana. “What’s more, the power you get from them is dynamic – it can be in short bursts or a continuous trickle – quite the opposite of what you get from a battery.”

For now, applications are limited to devices that require only small amounts of energy, Kassner said. “Not only this, but they need to implement very efficient sleep modes so that only a very tiny amount of energy is consumed when the system is not active,” he said.

Freeland advises consumers not to believe everything they read. “I recently read an article in a national newspaper claiming that in the future our mobile phones will be using energy harvesting to get power from ambient radio frequency signals,” Freeland said. “This is nonsense. Some of the research out there seems to bypass the fundamental laws of physics. If you’re going to make progress in energy harvesting then you need to be realistic. The fact remains that you can only harvest very small amounts of power using these techniques.”

The power limitations of energy harvesting are why GreenPeak has turned its attention to creating new, more efficient forms of battery power. “We spent three years pursuing the energy harvesting market,” Links said. “While there are several areas with great potential, for us it just wasn’t economically feasible. So we’ve instead set about creating a low cost, ultra-low-power battery which has a life expectancy of over 10 years.”


Despite the limitations, new ways of applying harvested energy offer huge IoT potential, and the industry shows no signs of slowing. According to WinterGreen Research, an analysis and forecasting company based in Lexington, Massachusetts, the global energy harvesting market – which was worth US$131.4 million in 2012 – is projected to increase to US$4.2 billion by 2019.

One area where increased adoption seems inevitable is in smart cities. “More than half of the growing world population lives in cities,” Kassner said. “Intelligent control will be needed to coordinate people’s daily life and protect the environment and resources at the same time. The concept of a smart city intends to provide automated control of traffic, streetlights, energy supply or transportation of needed goods as well as waste disposal. This can only be realized with billions of wireless sensor nodes collecting and delivering the data needed.”

Portland State’s Blyler predicts: “We’ll see progress in a number of IoT applications, including everything from smart buildings, where the energy could be generated locally and transferred without wires, to recharging a hearing-aid battery. We’ll also see advances in the wearables space; consumers show a very low tolerance for devices that constantly need to be recharged.”

“Overall there’s a very exciting future ahead,” EnOcean’s Kassner said. “We expect a wave of IoT sensor developments as radical as we’ve seen with computers and mobile phones.” ◆

SMB advantage

How cloud computing is helping smaller companies compete globally

William J. Holstein

5 min read

For decades, the high cost of computer hardware and software put small and medium businesses (SMBs) at a competitive disadvantage, especially as they sought to expand beyond their home countries. But cloud computing, which allows SMBs to access hardware and software on a pay-as-you-go basis, is leveling the playing field.

Arthur Léopold-Léger dreams of creating a global aircraft company in La Rochelle, a seaport on France’s Atlantic coast north of Bordeaux. That his dream is becoming reality speaks volumes about how the emergence of cloud computing has altered the business world’s established order.

Léopold-Léger’s small team of engineers at Elixir Aircraft is designing two-seat aircraft that take advantage of the state-of-the-art technology used in America’s Cup racing and incorporate carbon fiber composite materials made in Japan.

Léopold-Léger predicts that the end result, when sales begin in 2017, will be aircraft that are sleeker, safer and less expensive than anything else on the market. Elixir’s first target is Europe, followed by South America, South Africa, Australia and New Zealand. In the final stage of expansion, Léopold-Léger expects to start selling in Asia and North America.

Tiny Elixir can have such aggressive global dreams because cloud computing enables the firm to tap the software and computing power it needs, when it needs it, from its technology supplier, without massive upfront investments in computer hardware and software. Designing an aircraft, of course, requires significant design and simulation computing power on a robust platform. “But we don’t have to invest in hardware or administration,” Léopold-Léger said. “That’s one employee we don’t have to hire for perhaps €50,000 a year. For a small company like us, that’s a huge percentage of our budget.”

His designers also have the flexibility to use software programs for a finite period of time and then shift to other software tools as needed. “Whenever we have something to do we know we have the tools,” Léopold-Léger said. “And if we don’t have it, it’s one phone call away. We can use it for just a few months and switch it off.”


Elixir is breaking the long-standing paradigm in which small and medium businesses (SMBs) worldwide have been at a disadvantage relative to their larger multinational counterparts when they attempt to expand globally. Smaller companies may be nimble and fleet of foot at home, but relatively few can afford to build the complete information technology (IT) infrastructure required to support robust international sales and operations.

Today, however, after years of often empty hype, cloud computing is transforming the competitive landscape for SMBs worldwide. These companies no longer have to build an expensive IT infrastructure to support inventory, supply chain, customer relations, human resources and other functions required by any modern business, particularly a global one. They have access to collaboration tools that enable them to operate across national boundaries in ways that only large companies could once afford. SMBs can, in effect, rent the services they need, à la carte, when they need them, from giant players that include Amazon Web Services and from smaller, specialized cloud computing providers such as Outscale, which is based in Saint-Cloud, France.

An added plus is that most cloud-based systems handle multiple currencies and languages across geographic locations, 24 hours a day, 365 days a year.


Being able to expand a company’s IT infrastructure quickly and to cherry-pick precisely which functions are needed allows CEOs of smaller companies to take on global risks they might not have contemplated even five years ago. Flexibility in “scaling” – the ability to add or shed resources quickly at low cost – is particularly important for smaller companies because they can’t always anticipate a surge in orders. With cloud computing, companies can achieve scale almost immediately rather than taking months to place orders, receive equipment and hire the right people to make it all work.

“We’re seeing adoption from a bottom-up perspective – smaller companies are adopting cloud solutions before some larger companies,” said Christopher Holmes, Singapore-based managing director, IDC Insights Asia Pacific, and international head for IDC Manufacturing Insights, part of global industrial consulting firm IDC. “It’s affordable and they often don’t have an internal IT team. They can rely on a cloud vendor.” By contrast, Holmes said, larger companies have big, expensive legacy systems implemented over the course of decades, which can make it difficult to migrate those functions to a cloud services provider.


The United States and Western Europe have been at the forefront of helping smaller companies go global using the cloud, but Asia appears to be catching up, said Craig Downing, senior product marketing director in charge of global cloud strategy for San Diego, California-based Epicor Software. Epicor is considered a second-tier cloud services provider; such providers tend to work directly with customers rather than through channel partners.

Asia has lagged, Downing said, because some Asian governments have resisted efforts by Western technology companies to build the technical infrastructure that supports cloud computing in their jurisdictions, due to concerns about transferring data across national boundaries. “But now we are starting to see them use the cloud,” Downing said.

Elixir Aircraft’s first model was designed with hardware and software hosted thousands of miles from its offices near La Rochelle, France, demonstrating how small companies are leveraging cloud computing to compete globally. (Image © Elixir Aircraft and cockpit design firm TWIN)

Cloud computing is advancing for smaller Asian companies on two major fronts. First, many Asian companies are suppliers to large Western companies that insist on doing business via the cloud. “The Asian companies are participating as part of the supply chain of larger customers embracing cloud systems,” Downing said. “If you are a vendor to Raytheon or Boeing or Bosch, you’re being pulled into that customer’s supply chain and you’re interacting directly with their cloud-based ERP systems.”

On a second path, Asian subsidiaries of Western companies are creating locally based cloud operations that link to their parent companies’ global cloud-based systems. “Their operations in Thailand or Vietnam don’t necessarily have the same level of complexity or scale requirements” to justify adopting large, expensive systems from major cloud providers, Downing said. “So we’re seeing them roll out second-tier solutions.”


The use of cloud computing has changed the very nature of how some smaller Asian enterprises function, especially where profit margins are razor thin. One example is Australia- and Singapore-based Ghim Li, a group of companies under the umbrella of GLG Corporation Limited, which supplies textiles and apparel to large US retailers including Macy’s, Sears, Walmart and Target.

Previously, Ghim Li invested a great deal of time and money in sending samples of fabrics back and forth from its factories in Southeast Asia to customers in the West. Quality control inspectors also spent a great deal of time traveling from factory to factory.

In January 2013, Ghim Li implemented a hybrid telephony and audio-video conferencing platform, located both on the cloud and on premise, to better connect employees in disparate locations and communicate fabric designs via video. “We needed a solution that gave us a competitive advantage,” Chief Information Officer Timothy Ngui said. The system, Ngui said, saves Ghim Li tens of thousands of dollars a year.

IDC’s Holmes said he is also seeing smaller Chinese enterprises use the cloud to develop a global footprint, allowing them to leverage the best resources regardless of location. “A Chinese shoe manufacturer can put its design functions in Italy, yet the manufacturing is done out of southern China,” he said. “They can leverage the cloud capability and discuss everything in real time.”


Of course, like any technology trend, cloud computing sparks debates among CEOs about the best way to use it. Some CEOs believe they should retain any sensitive data or information in their on-premise systems to better protect it from hackers, but other CEOs argue that their secrets are best protected by large cloud suppliers, which continuously update their software and have the resources to mount the best digital defense.

Others note that over a five- or 10-year period, relying on cloud services may prove more expensive than building one’s own systems. The vast majority of SMB CEOs, however, must overcome acute short-term challenges; if they fail to seize opportunities quickly, they will be blown away by more agile small rivals or by large competitors with deep pockets.

“The cloud isn’t a silver bullet but it’s pretty close,” said Stéphane Maarek, vice president North America at cloud services provider Outscale. “Not having to invest a tremendous amount of energy, resources and capital in building teams and setting up a commodity infrastructure really helps CEOs focus on their core business activity. They can reinvest money that was originally spent on infrastructure toward achieving their key business goals.”

Bottom line, cloud computing is helping SMB executives achieve global strategies that once would have been inconceivable. ◆

Human 2.0

Tech is reducing humanity’s physical limitations

Dan Headrick

5 min read

Throughout recorded history, human beings have sought to extend their physical and mental capabilities. Today’s technologies – which are beginning to enhance humans with mechanical augmentations and computerized sensor technology – stand poised to alter the human experience in ways previously imagined only in science fiction.

On a warm, rainy evening in June 2015, a young girl visiting the Science Center Berlin at Potsdamer Platz was fighting to keep her balance while walking a narrow path across a plunging virtual ravine. Fun (and safe) as it was, the balancing act in the “Discover what moves us” exhibit helped to demonstrate one function essential to a human’s ability to control more than 200 bones and 600 muscles: perception.

Science Center Berlin was established in part to show people like that able-bodied girl how technology assists those who need help walking, balancing or grasping objects. The exhibits also illustrate how technology is bridging the gap between “disabled” and “able” and how science is shifting the line between human limitation and human potential.

Hugh Herr lives with physical challenges daily. In 1982, Herr lost both of his legs to frostbite in a mountain-climbing accident. “At that time, I didn’t view my body as broken,” Herr told a TED Talk audience in the spring of 2014. “I reasoned that a human being can never be ‘broken.’ Technology is broken. Technology is inadequate.”

Today, Herr leads the Biomechatronics Group at the Massachusetts Institute of Technology (MIT) Media Lab, where scientists and engineers are developing innovative technologies that restore and enhance human capability. “From synthetic constructs that resemble biological materials, to computational methods that emulate neural processes, nature is driving design,” Herr said. “Design is also driving nature. In realms of genetics, regenerative medicine and synthetic biology, designers are growing novel technologies not foreseen or anticipated by nature.”


Össur, a noninvasive orthopedics company based in Reykjavik, Iceland, dramatically demonstrated Herr’s predictions when it unveiled a bionic leg that reacts to signals from muscles in the thigh. While similar devices have been demonstrated in the lab, including neural devices implanted into the brain’s motor cortex that allow an amputee to control the movement of a robotic limb with thought, Össur’s is the first such bionic limb available commercially.

Wearable support devices that transfer load away from the back and legs are seeing growth in commercial applications in the workplace, in the home, for recreation and in the military. (Image © Ekso Bionics)

Össur’s breakthrough is just one of many being reported almost daily. Second Sight Medical Products, a California company, has patented a retinal prosthesis system that creates the perception of light patterns in the brain. The system looks like sunglasses wired to a small video processor. A miniature video camera captures images that the processor converts into electrical pulses, which a blind person learns to interpret as visual patterns.

Meanwhile, synthetic “skins” are being developed for prosthetic limbs; the skins stiffen and soften, mirroring the behavior of human skin where it attaches to the body. The result is more natural, integrated support, creating flexibility and comfort between the human and the device. Researchers say the same technology can make everyday clothing and footwear more comfortable and functional for all. In fact, companies are developing bionic systems for otherwise able-bodied people – products marketed for human potential, not just for overcoming limitations.


In 2014, Keith Fitz-Gerald, an investment strategist with Money Map Report, described human augmentation as an “unstoppable trend.” In a 2015 update he wrote: “At the time I noted that the human augmentation market will conservatively be worth hundreds of billions of dollars by 2020 ... I think I may have understated things.”

One reason is that companies worldwide are leveraging human enhancement technology to improve the safety and capabilities of their employees. Assembly line workers at Audi’s new facility in Neckarsulm, Germany, for example, have been equipped with powered posture-support devices from the Swiss-based company Noonee. Workers wear the “Chairless Chair,” which takes the weight and load normally placed on an individual by various tasks, to reduce fatigue and the risk of injury to the back and legs.

Other well-known companies are commercializing human augmentation technologies for other work environments. In 2015, Honda Motor Company began leasing its Walking Assist Devices, which support those with weakened leg muscles, to hospitals and rehabilitation facilities in Japan. Panasonic, Lockheed Martin and Raytheon are developing similar technologies.

Investors are lining up to commercialize an unpowered lightweight lower-leg device with a spring and clutch system that works with calf muscles and the Achilles tendon to reduce metabolic energy consumption while walking. The device was developed by engineers at US universities North Carolina State (Raleigh) and Carnegie Mellon (Pittsburgh). “The market projections are pretty crazy,” said Greg Sawicki, who worked on the project.



Military applications also are booming. Ekso Bionics, for instance, is working with the US military to develop an armored exoskeleton to help foot soldiers walk, run or climb farther and faster and to carry heavy loads over rugged ground. MIT is working on similar systems that assist normal walking for healthy people. “We’re beginning the age in which machines attached to our bodies will make us stronger and faster and more efficient,” Herr said.


With so much activity, futurists have begun to predict where the trend of human augmentation may be heading. Ray Kurzweil, renowned inventor, futurist and director of engineering at Google, told audiences at a New York finance conference in June 2015 that humans will become hybrids in the 2030s, and predicted that our brains will be connected to the cloud via nanobots made from DNA strands. “Our thinking then will be a hybrid of biological and nonbiological thinking,” he said. “We’re going to gradually merge and enhance ourselves. In my view, that’s the nature of being human – we transcend our limitations.”

Such predictions have ethicists and policymakers wringing their hands. Who decides which enhancements go too far? Use of performance-boosting steroids by professional athletes: bad. Cosmetic plastic surgery: fine. Vaccines against deadly diseases: great. Military use of “personality pills” and “thought helmets”: weird.

One problem, experts say, is lack of consensus about simple definitions. For example, what is a disability? According to the United Nations, about 1 billion people worldwide (about 15%) are considered disabled. If, however, the definition of disability is expanded to include depression, schizophrenia, autism and other cognitive conditions, more than half of the world’s population would be included.

1 billion

According to the United Nations, about 1 billion people worldwide are considered disabled.

Researchers from the Academy of Medical Sciences, the British Academy, the Royal Academy of Engineering and the Royal Society, who looked only at the impact of human enhancement in the workplace, were troubled by the lack of data they found. Their 2012 report, titled “Human Enhancement and the Future of Work,” posed a number of questions. For example, should employers expect enhanced workers to do more dangerous jobs? Does an “enhanced” individual have unfair advantages over a “normal” job applicant? And who pays for the devices: governments, insurers, employers … or workers?

These questions, ultimately, ask what it is to be human. For Hans-Willem van Vliet, managing director of R&D at Otto Bock Healthcare, this is the premise behind the Science Center Berlin exhibition where the young girl walked the virtual gap – that humans desire “soft solutions.”

For nearly a century, Otto Bock has been making artificial limbs, most of them clunky, cold and devoid of sensory input. But “people are emotional beings,” van Vliet said. The future, he predicted, will therefore lead science into the neural pathways of the human brain to tap the senses where human emotions are born, “to feel the wind, a touch, goose bumps, holding hands.” ◆

The human microbiome

Why the bacteria inside us may be the next medical revolution

William J. Holstein

5 min read

The average human carries 20,000 different species of bacteria in his or her digestive system, weighing an estimated 2 to 3 pounds (0.9 – 1.4 kg). Researchers are beginning to understand how these bacteria, collectively called the human microbiome, are linked to an astonishing variety of ailments – and perhaps to their cures as well.

The average human carries 20,000 different species of bacteria in his or her digestive system, according to scientists. These bacteria, collectively called the human microbiome, have been linked to a variety of ailments, including diabetes, obesity, asthma and autoimmune diseases. They also have been linked to depression and overall mental health.

Scientists can identify the bacteria by decoding DNA using genetic sequencing machines, ushering in the dawn of a new era in understanding human health.

Researchers in the US, Europe and China are racing to put the knowledge to work. The complexity is astonishing. Scientists do not know how the different populations of bacteria interact with each other, and they do not understand the specific mechanisms by which the bacteria impact the health of a human host. They do know, however, that people in different parts of the world have very different collections of bacteria inside them. They suspect that the various bacterial soups may help to explain why some nationalities are prone to diseases that rarely occur in others. And that has many scientists asking: Could we prevent or cure diseases by managing the mix of bacteria in a person’s gut?


As a pioneer in this field and head of the American Gut project, the world’s largest open-source science project to understand microbial diversity in the human digestive system, Rob Knight is one of the world’s leading human microbiome scientists.

“The complexity of the microbiome likely exceeds the complexity of cancer because of the sheer number of genes involved, the number of configurations and the number of attractions among cells,” Knight said. “Your microbes have way more genes than you do, and they have way more connections between them.”

In purely technological terms, one key to making sense of it all is to drive down the cost of deciphering a gene, which means continuing the revolution in DNA sequencing machines.

“The big challenge is really the DNA sequencing,” Knight said. “At the moment, each batch of DNA sequencing is expensive and you need to accumulate a lot of samples to put them into the sequencer.”

BGI, formerly known as the Beijing Genomics Institute, based in Shenzhen, China, is the world’s largest purchaser of DNA sequencing machines. BGI is exploring many genomics subjects, including the bacteria in the digestive systems of newborn children.


The microbiome’s complexity, however, makes the well-chronicled big data challenge look like child’s play. The American Gut project uses DNA extraction protocols from MO BIO Laboratories, a privately held biotech company in Carlsbad, California. It then processes the results either on the University of California’s supercomputer network or on Amazon Web Services, a provider of cloud-computing services.

All of these fields – big data, supercomputing and cloud computing – are undergoing rapid evolution, and all are essential to making sense of the microbiome. “It’s kind of like asking what’s more important for a cake; is it the flour or the eggs or the sugar?” Knight said. “The reality is that you need all of them to be successful.”

To date, some 6,000 people have had their gut bacteria analyzed by the American Gut project, but the complexity of the data is difficult for anyone, even a medical doctor, to decipher.


Aside from technology, scientists who wish to unravel the microbiome’s secrets will have to change the way they conduct research, said David Agus, professor of medicine and engineering and director of the Center for Applied Molecular Medicine at the University of Southern California in Beverly Hills, as well as a leading cancer researcher who is expanding his work into the microbiome.

“My team has physicists, mathematicians, mathematic modelers and biologists all on one team,” Agus said. “We have to do something that was considered heretical a decade ago. We have to converge the sciences. That’s how we’re going to make breakthroughs in fields like this.”

Agus’ team relies on massively parallel computing, an advanced type of supercomputing that uses a large number of processors or separate computers to perform a set of coordinated computations simultaneously.

He argues that it is folly to insist on establishing every data point – for example, how one population of bacteria engages with another – because that is beyond the capabilities of even the most powerful supercomputers for the foreseeable future.

It is better, Agus said, to develop “coarse” theories based on approximations. One coarse theory is that, although no one understands the precise connection, tobacco smoking is almost universally acknowledged to increase the risk of lung cancer. Agus recommends a similar common-sense approach for the emerging microbiome field. “This is a complex emergent system,” he said. “You have to look at it in terms of modeling, not the reductionist approach, to try to understand every data point.”


Although the work is just beginning, some signs indicate how different industries will seek to profit. For example, one medical response to an antibiotic-resistant “super bacteria” called Clostridium difficile (or C. diff), which kills thousands of people a year, is the use of fecal microbiota transplants. A patient is, in effect, given a colonoscopic infusion of fecal microbiota from a healthy donor in an effort to clean out the C. diff.

While this method has been shown to be effective against C. diff infections in the US, the US Food and Drug Administration has not authorized it for any other bacterial imbalance. Some manufacturers have developed a less invasive alternative: pills containing treated fecal material that can be swallowed. The goal is the same: changing the gut’s bacterial balance.

Another exploding field is probiotics, which are taken orally to build up the number of “good” bacteria. One breakthrough, discovered by Karl Seddon, who graduated from the University of Oxford (UK) with a master’s degree in biomedical engineering, allows probiotics to pass intact through the digestive tract until they reach the large intestine. Once there, each dose reportedly delivers 50 times more beneficial bacteria than the average probiotic supplement. The technique is now the cornerstone of a UK-based company called Elixa Probiotic Limited, which reports that its six-day program releases 3 trillion units of “good” bacteria into the user’s microbiome, compared with 10 billion from the average probiotic. Because they are considered a food supplement, Elixa’s capsules are available in many jurisdictions without a prescription.
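The reported figures are internally consistent, as a quick back-of-the-envelope check shows (the even split of the program total across six daily doses is an assumption for illustration):

```python
# Back-of-the-envelope check of the reported probiotic figures,
# assuming the 3 trillion units are spread evenly over six daily doses.
elixa_total = 3_000_000_000_000   # 3 trillion units over the six-day program
days = 6
typical_dose = 10_000_000_000     # 10 billion units in an average supplement

per_dose = elixa_total // days    # units delivered per daily dose
ratio = per_dose // typical_dose  # multiple of the typical supplement dose

print(per_dose)  # 500000000000 (500 billion units per dose)
print(ratio)     # 50, matching the claimed "50 times more" per dose
```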


These techniques are blunt instruments, however, compared to the goal of achieving a deep understanding of exactly what is happening in the interactions among different bacteria and the human host.

Over the longer term, the big payoff could come when either pharmaceutical or food companies or both learn how to tweak existing products or develop new ones that impact the microbiome in ways that spur specific health outcomes. Currently, the food industry has a head start because pharma companies, as a rule, try to identify a specific molecule, patent it and then conduct clinical trials to win approval from government regulatory agencies.

SCOTT J. PARKINSON, Head of Gastrointestinal Health and Microbiome, Nestlé Institute of Health Sciences (Image © Greg Lefebvre / Nestlé Institute of Health Sciences)

The sheer mathematical complexity of the microbiome also may require a mixture of different substances to reshape it; a one-molecule solution may not prove effective, said Scott J. Parkinson, head of the Gastrointestinal Health and Microbiome group at the Nestlé Institute of Health Sciences in Lausanne, Switzerland. “People are becoming more and more aware that it isn’t a single bug you’re trying to change,” Parkinson said. “You’re actually trying to change an ecosystem.”

The vast possibilities are easy to imagine. If Nestlé, for example, could tweak its infant milk products – just one of its many product lines – for different types of children in different geographies, the result could be enormous health benefits and large sales.

“Nestlé and other food companies are investing a lot of money trying to figure out the future of the microbiome, as far as product development goes,” Parkinson said. “Maybe you think about products that combine knowledge about lipids, carbohydrates and proteins and target the microbiome at multiple levels to get the ecology we’re looking for in very specific patient populations. Then we could expand it and tailor-make products for various consumers or patients.”

Like Agus, Parkinson is skeptical that scientists will succeed in cracking open the microbiome’s secrets any time soon. At least for now, they will have to rely on the old-fashioned scientific method to formulate hypotheses about what works and then test those theories. The way an individual responds to a particular drug, for example, could serve as a marker that reveals clues about his or her microbiome.

The race is truly global, and it is picking up speed. Many people suffering with untreatable ailments hope answers come fast. ◆

Retooling apprenticeships

On-the-job training lowers unemployment rates by closing the skills gap

Charles Wallace

5 min read

The missing link for closing the gap between unfilled jobs and available skills may be hidden in the age-old idea of apprenticeships: combining education with formal, hands-on training. Each nation employs apprenticeships differently and benefits vary, but the practice is gaining traction, with plenty of room to grow.

US employment statistics encapsulate one of the great conundrums facing companies everywhere in the 21st century: the gap between available jobs and workers with the skills to fill them.

At the end of June 2015, for example, the US had 5.2 million job openings, according to the US Bureau of Labor Statistics (BLS), many of them left unfilled because firms could not find enough workers with the right job skills. Meanwhile, in July 2015, 2.8 million young people between the ages of 16 and 24 were unemployed, a rate of 12.2%, the BLS said.

A similar youth unemployment problem prevails in much of the developed world, and developing countries face an even greater challenge finding enough skilled workers. However, one increasingly popular solution to closing the job-skills gap in many advanced economies retools a relatively old idea: apprenticeships that combine education with formal, on-the-job training.


In the US, the number of apprenticeships has been relatively low and largely confined to heavily unionized sectors such as construction, plumbing and electrical work. To change that pattern, in September 2015 US President Barack Obama announced US$175 million in grants to 46 organizations across the US to develop or expand apprenticeships in high-growth areas.


Almost 800,000

The total number of apprenticeships available annually in the UK.

“Apprenticeship offers a smooth pathway to the middle class and to a college degree for those who wish to continue their education and training,” Secretary of Labor Thomas Perez said. “I like to call it the other college – without the debt.”

The US is not alone. In the UK, the number of apprenticeships quadrupled between 1996 and 2009, then doubled again between 2009 and 2013, bringing the total number of apprenticeships available annually to almost 800,000, according to Tom Bewick, former CEO of the International Skills Standards Organization (INSSO), a London-based workforce development consultancy. Apprentices are now employed by organizations as diverse as The Royal Opera, Jaguar Land Rover and British Aerospace, a major advocate with more than 1,000 apprentices.

One of the largest programs in the UK, Microsoft’s Partner Apprenticeship program, takes high school dropouts aged 16-18 and trains them in computer skills, for which there are 160,000 unfilled jobs annually. The program plans to train 3,500 apprentices by the end of 2015 at a variety of firms, ranging from large employers to hundreds of small enterprises. “We view this as a massive success,” said Dominic Gill, Microsoft UK’s apprenticeship lead. “It has a retention and employment rate of about 95%, far above the industry average of 80%.”


The idea of apprenticeships emerged in the Middle Ages, when skilled craftsmen in European guilds trained young people in exchange for room and board, increasing each craftsman’s capacity while ensuring that their skills were passed to the next generation. Although apprenticeships continue to exist in many countries, Germany, Austria and Switzerland have the most developed systems and, not coincidentally, the lowest youth unemployment rates.

In Germany, an estimated 51% of young people undertake some form of apprenticeship. “Independent studies show that where there are structured apprenticeships, not only is there a net benefit and financial return for employers, but also a net return to taxpayers for their public subsidy,” INSSO’s Bewick said. For every British pound invested in public apprentice programs, Her Majesty’s Treasury estimates the state gets a return of 12 pounds in tax revenues and other benefits, such as productivity increases.

Apprenticeships exist in many countries, but Germany, Austria and Switzerland have the most developed systems and the lowest youth unemployment rates. (Image © Goodluz / iStock)

One big difference between the German-speaking countries and some English-speaking nations is the age at which apprentices are recruited. In Germany, students are often streamed into vocational training and on-the-job work programs at about age 15 – about the same time that students in the US and Britain are beginning secondary school. In those countries, apprenticeships generally don’t begin until after high school graduation, when most students are approximately 18.

One reason is that, in the US and Britain, schools are more oriented toward educational attainment than career achievement, said Robert I. Lerman, an institute fellow in the Center on Labor, Human Services, and Population at the Urban Institute, an economic and labor policy think tank in Washington, DC.

“We want everybody to have a chance to go to college, even if that might not be the optimal route for many people who learn better in other ways,” Lerman said. He notes that graduation rates from community colleges, which traditionally have provided technical and vocational training in the US, remain troublingly low at 20% of full-time students in the first three years. Apprenticeship programs, he said, offer a viable alternative for many students.


Meanwhile, employers are seeking apprentices with a high degree of job-related educational attainment. Britain has three levels of apprenticeship; the highest level, sought by firms such as British Aerospace, requires passage of four advanced qualifying exams. Apprenticeships take one to four years to complete, depending on their level.



In Germany, a relatively new program called AusbildungPlus (TrainingPlus) offers a dual-track system in which apprentices work for a firm while earning a university-level engineering or business administration degree, up to the doctoral level. “The main benefit is that it becomes easier to access university education when starting out as an apprentice,” said Samuel Mühlemann, a professor of human resource education and development at the Munich School of Management, Ludwig-Maximilians-Universität München. “The program allows firms to educate their future engineers exactly to the needs of the company, since apprentices will have several years of work experience by the time of graduation.”

Another country with an innovative approach to apprenticeships is Australia, which has even more apprentices per 1,000 workers than Germany. One approach used successfully in Australia is known as Group Training Organizations (GTOs), which serve as intermediaries between companies and state/regional governments.

Nicholas Wyman, CEO of the Institute for Workplace Skills and Innovation, operates a GTO called WPC Group in Australia that trains more than 500 apprentices with more than 200 host employers. Wyman published a book about the program called Job U: How to Find Wealth and Success by Developing the Skills Companies Actually Need. The GTO firms play an important role, Wyman said, by assuming responsibility for the apprentices: paying their salaries, handling the paperwork and providing uniforms while mentoring them and tracking their progress. The companies the apprentices work for fund the GTOs.

“You can’t rely on the government to support the apprenticeship programs,” Wyman said. “That’s why the mentored apprenticeship model, where the employer is the main financial contributor, works very well.”

With governments around the world pouring money into apprenticeship programs, many experts believe the idea has reached a tipping point. With fewer than one in 10 firms in the UK employing apprentices – and even fewer in the US – there is plenty of room for these programs to grow.

“The apprentice route should not be seen as the soft option people are forced to go for because they can’t get into college,” Bewick said. “There is too much of a sense that there are two systems, with the university route offering higher paying jobs. The apprentice route is geared toward occupations that need different learning styles than formal university education.” ◆

Alan Mulally

Former CEO of Ford and Boeing Commercial Airplanes says all companies are becoming digital, connected

Tony Velocci

8 min read

In a 45-year career, Alan Mulally helped engineer the first commercial airliner built without physical prototypes and then led both Boeing Commercial Airplanes and Ford Motor Company. Now a board member for Google and Carbon3D, the high-energy Mulally shares his enthusiastic perspective on how the futures of traditional and new-age industries are converging.

Alan Mulally is not your average retiree. At 71, he’s one of the very few business leaders who have had the opportunity to change the business world twice: at Boeing, where he was CEO of Boeing Commercial Airplanes, he pioneered the use of a new generation of computer-aided design (CAD) software that revolutionized manufacturing; then, before retiring as president and CEO of Ford Motor Company in mid-2014, he led the 112-year-old automobile maker’s turnaround from a US$17 billion loss at the end of 2006 to profitability by 2008 – without the aid of government bailouts.

Today, Mulally is still very much in demand for his talent, his ideas and his knack for gaining competitive advantage through organizational culture, and he serves on the boards of two high-tech companies. When he speaks, people listen – and his insights about the future of business might best be described as provocative.

US$17 billion

Ford Motor Company lost US$17 billion in 2006. In 2008, after two years under Alan Mulally’s leadership, the automaker turned a profit.


For example, Mulally believes that the distinction between well-established companies that participate in more traditional industry sectors – ones that existed long before the start of the dot-com era – and so-called “new economy” companies heavily involved in the technology sector will become increasingly blurred.

“I find the whole discussion about digital versus non-digital companies very interesting, but it’s just not true,” he said in a wide-ranging interview exclusive to Compass. “Digital technology, the Internet, information processing and the ever-improving quality and miniaturization of sensors and robotics will enable the quality, productivity and transformation of all industries around the world. All companies will be brought together by databases and systems thinking. Individual companies simply will need to decide which things they are working on to add value in which industry. The enabling technologies will be exactly the same.”



The most important lesson that business leaders can learn from broad societal and business trends, he said, is the power of operating systems that deliver connectivity. “Information is going to be ubiquitous, and everybody around the world will have access to it,” Mulally said. “Can you imagine what’s going to happen when people get a chance to access that information and work together to create even more value for all of us? Embracing the integration of hardware, software, sensors and systems absolutely will be the key to the future for everybody.”


As one of the first executives to bring a traditional heavy industry into the digital age, Mulally sees a parallel between the digital-driven transformation he discerns on the horizon today and what he experienced 25 years ago, when he employed the power of digital technology to break the decades-old paradigm of how commercial airplanes were designed and built.

The year was 1990. Boeing had kicked off development of the “Triple Seven” jet from scratch in 1988, and Mulally was director of engineering. Prior to the 777 model, engineers created physical parts from two-dimensional drawings on paper. Specialists in various departments would design them, but it was up to manufacturing to figure out how to produce and assemble them. Testing form and fit was impossible until the first physical prototype was built.

In Boeing’s previous development of the 767 model, the paper-to-prototype approach had required about 13,000 individual design changes to the door assemblies alone. Management knew that if Boeing was going to deliver the capabilities and quality its customers wanted and complete the even larger 777 project on time and on budget, the company’s approach to designing and building planes needed to change radically.

Boeing had been following trends in computer-aided design and computer-aided manufacturing (CAD/CAM) technologies, in which Mulally had taken a special interest. “We knew that if we could find a breakthrough in how parts are created, so they could be assembled quicker and more easily, we could make a tremendous improvement in the quality of the end product and in our productivity,” Mulally said.

In Boeing’s global search for solutions, Mulally and his team tapped Dassault Systèmes, the developer of CATIA (and the publisher of Compass), which enabled engineers to simulate the assembly of any product – an aircraft, an automobile – in three dimensions in a computer, prior to actual physical production. This capability allowed engineers to verify that all the parts would fit properly before they were manufactured. Theoretically, by virtually eliminating the trial-and-error approach to building and fitting parts, the software could save Boeing substantial time and money.

But could the software be scaled up to simulate and build an entire airplane? Mulally and Bernard Charlès, who was Dassault Systèmes’ president of strategy, research and development in 1990 and is now the company’s president and CEO, thought the case for designing and virtually pre-assembling parts in three dimensions – bypassing paper blueprints altogether – was so compelling that it was worth the risk.

To prove their point, Boeing built a mock-up of the 777’s nose section. The test verified the concept and demonstrated that CATIA could be scaled up to simulate an assembly of the entire airplane. The test was so successful, in fact, that all planned physical mock-ups were canceled.

“Unheard of – never been done!” Mulally exclaimed with as much excitement and pride as if the achievement had happened yesterday – not 25 years ago. “It was probably one of the biggest single improvements in the design and manufacturing of airplanes in the last 100 years.” The gamble was such a success that United Airlines accepted the very first production airplane virtually defect-free and on time, a remarkable accomplishment for such a large, complex engineering effort.




During his 37-year career at Boeing, starting as an aerodynamicist fresh out of the University of Kansas, Mulally held engineering and management positions on design teams for every Boeing commercial jet aircraft. He hoped he would have the opportunity to help design one more, but fate had other plans: a telephone call in 2006 from Ford President, CEO and COO William Clay Ford Jr., great-grandson of Ford founder Henry Ford, who offered Mulally the chance to lead the company his family had run for generations.

Mulally’s first questions to Ford centered on the extent of the company’s problems. All of the major US automotive producers were in dire straits, caught in a perfect storm of poor business strategy, eroding market share, too few fuel-efficient models, a worsening credit crisis and lackluster execution. Leaving Boeing, Mulally admits, was a wrenching decision. “The reason I decided to accept Bill Ford’s offer was that I felt like I was being asked to help transform and save a second American icon, and that idea was compelling to me,” Mulally said. “To get a chance to have Ford profitably growing with the best product line in the world was very exciting.”

When Mulally was put in charge, the company was on the verge of bankruptcy. But he had a vision for rallying the workforce around an overarching goal: focus on the Ford brand and become best-in-class with a complete family of cars, trucks and utility vehicles serving all markets worldwide.



Despite his stellar track record at Boeing, Mulally was widely perceived as an outsider in Detroit – an airplane guy, not a car guy. Given the daunting task he faced, industry observers in both the automotive and aerospace worlds openly doubted whether Mulally would succeed.

However, he brought to Ford the same obsessive focus he had applied at Boeing: motivating the entire team – managers, production people and suppliers – to work together. By the time Mulally left Ford, the company had achieved one of the most extraordinary corporate turnarounds in the modern history of business.


Retirement doesn’t sit well with the high-energy Mulally, however. Soon after leaving Ford, he joined Google’s board of directors. Mulally said that the search giant, which makes almost any information available in a nanosecond and is moving aggressively to help build a global Internet of Things, shares his passion for cutting-edge ideas, rigorous accountability and a strong belief in the power of democratized information and computation. They also share a passion for talented and motivated employees, working together, and contributing to economic development, energy independence and environmental sustainability.

But continued advances in global connectivity aren’t the only technologies that Mulally believes will upend business and change the way the world’s populations live. Additive manufacturing, also known as 3D printing – a process of fusing layered materials to create complex three-dimensional objects – also sparks his excitement, which makes his position on the Board of Directors of Carbon3D, a pioneer in 3D printing technology, a natural. “Imagine being able to make products when you need them instead of having warehouses full of inventory,” Mulally said.

In 2004, in an animated display of his true-to-form enthusiasm, former Boeing Commercial Airplanes CEO Alan Mulally visited with employees who operated the hangar where the B737 commercial aircraft were painted. (Image © The Boeing Company)

He also expects big data to be a disruptor. As tools and processes are developed to analyze massive amounts of information in real time – from the consumer all the way through manufacturing – companies will be able not only to improve the quality of the goods and services they deliver to customers, but also to boost their productivity, he said.

Mulally foresees virtual reality (VR) extending the digital revolution further into the physical realm and deeper into everyday life. VR has the potential to be even more of a problem-solving tool than it is now – think of applying the technology behind flight-training simulators to the world at large. Predicts Mulally: “Virtual reality will be applied to more industries with greater precision.”

But it is the ubiquity of information and access to it that Mulally thinks will have the greatest overall impact. “Everything will be connected, and that is the essence of what this world is going to look like,” he said. “The whole idea will be to make products that improve the quality of people’s lives and do it in less time and with fewer resources. That’s where this digital Internet revolution is going.” (See “Beyond the Internet of Things” starting on Page 52.)


As for the measure of companies’ success, Mulally said it starts with understanding the purpose of a business. “If the leader is clear about what the vision is for the product or business, and that person helps everybody understand what the strategy is for achieving that vision, and if you’re reviewing the status against the plan in regular review meetings, it’s possible to achieve an unbelievable cultural change in how people work together to create a profitably growing company based on great products and great services.”

To Mulally, leadership is about authenticity, getting things done, and the way you treat people. Most important is that leaders hold themselves and their teams accountable for coming together around a compelling vision, comprehensive strategy and plan, and adopting a relentless implementation process with clearly defined “working together” behaviors. The latter – which are at the heart of Mulally’s management philosophy – include principles such as putting people first, including everyone and setting clear performance goals. “That is the essence of leadership, and it really served all of us well at Boeing and at Ford. It was an absolute honor to serve both.”

When Alan Mulally left Boeing to become CEO of Ford in 2006, the venerable US automaker had a US$17 billion loss. Just two years later, Mulally’s laser focus on vision, strategy, honesty and respect had returned Ford to profitability. (Image © Chris Hondros / Getty Images)

Such a management style, paired with his trademark humility, has earned Mulally almost universal praise from the broader business community. Tom Captain, vice chairman and US and global Aerospace and Defense leader, Deloitte, for example, characterizes Mulally as “the gold standard” of business leaders.

“He uniquely possesses the X factor that characterizes an extraordinary leader,” Captain said of Mulally. “His superb ability to motivate and connect with customers and employees is topped off with his positive outlook and boundless enthusiasm. He is one of a kind.” ◆

Tony Velocci is retired editor-in-chief of Aviation Week & Space Technology magazine and a frequent contributor to Compass.

Left to your own devices

‘Bring your own device’ is taking hold at companies worldwide

Lindsay James

5 min read

Companies around the globe are embracing the bring-your-own-device (BYOD) movement, in which employees are free to use their personal devices to access corporate systems. The goal is to improve access to information, facilitate greater collaboration and increase productivity. While some organizations are reaping the rewards, however, others are paying a hefty price.

Hailed by David Willis, vice president of US-based consulting firm Gartner, as “the most radical change to the economics and the culture of client computing in business in decades,” the bring-your-own-device (BYOD) phenomenon is being embraced by enterprises around the globe.

“Today, over 70% of global companies have a BYOD policy allowing for some access to business systems from personal devices,” Willis said. “By 2017, more employees will have access to enterprise systems through their own personal devices than the number of employees with corporate-issued devices.”

It’s easy to see why BYOD has become popular. “Allowing employees to use the tools that they’re already comfortable with can only be a good thing – they get to take the productivity that they’ve achieved in their personal lives into their business lives,” said Marcus Lane, product marketing lead for Enterprise Mobility Management at multinational computer technology company Dell. “What’s more, embracing BYOD positions a company as a flexible, go-getting business. This is important when you consider that many prospective employees today are assessing companies based on what technology they use.”

However, an online survey conducted by CompTIA, a US-based nonprofit IT trade association, found that 53% of private businesses in the US prohibit BYOD, a significant increase from the 34% that prohibited BYOD in 2013. Of 375 IT professionals surveyed, just 7% said they allowed a full BYOD policy where the company takes no responsibility for the devices.


Global steel, energy, infrastructure and services conglomerate Essar, headquartered in Mumbai, India, is on the fourth iteration of its BYOD initiative, however. “The average age of our employees is 26-30 years, and they really value the ability to use the device of their choosing,” said Jayantha Prabhu, chief technology officer at Essar, which has operations in more than 29 countries and employs more than 60,000 people.

“What’s more, because they already understand their device, they are less likely to need our technical help desk,” Prabhu added. “Users have maximum flexibility for both professional and personal computing since they are one and the same. As a company, we are benefiting from a more agile workforce. Travel, relocation and office expenses are also reduced.”

Global conglomerate Essar has found that its employees are less likely to need technical help if they BYOD to work. (Image © Essar)

Ireland’s national airline, Aer Lingus, is also experiencing success with its BYOD implementation. “Having an effective BYOD policy has transformed Aer Lingus’ business,” said Patrick Irwin, EMEA field product marketing manager at Citrix Systems, a US-based software company and an Aer Lingus provider. “The company had 4,000 members of staff who previously had to go to a terminal or office in order to access information. Now, with an effective BYOD policy, staff have secure and flexible access to information anytime, anywhere and from any device. This empowers Aer Lingus employees to deliver great customer care in all areas of the business, from baggage halls to the cockpit. Efficiency has improved as a result, with more planes taking off on time.”


Despite the growth of the BYOD phenomenon, the practice is not all smooth sailing. While the promise of lower costs is appealing to many organizations, the reality for others, according to David Schofield, partner at telecommunications management firm Network Sourcing Advisors in Atlanta, is that BYOD is more expensive in the long run.

“We’ve recently had to help a technology company with 600 workers that went US$300,000 over budget in the first year of its BYOD program,” Schofield said. “When we consider what US$300,000 represents, it is approximately US$41 per (telephone) line, per month. That is a pretty significant impact.”

Part of the problem, Schofield said, was the program the company used to help employees purchase their devices.

“The client had a stipend that was more than twice what the enterprise would have paid for a corporate-liable device,” he said. “Under a corporate agreement, the device may have been free. The enterprise had also purchased a mobile device management system which was not cost effective. Finally, the enterprise help desk went from supporting two devices on the same operating system to a large variety of operating systems and devices, each of which had its own quirks interfacing with the organization.”

Effectively managing the security aspects of such an initiative is another pain point. According to Dell, 50% of its customers with a BYOD policy have experienced a security breach.

“Security is undoubtedly one of the biggest challenges in implementing a BYOD initiative,” Lane said. “It’s important to protect corporate data to avoid a breach of security, which could have huge implications for a business with a lot of sensitive information.”

Irwin agrees. “It’s really important to have the measures in place to encrypt data,” he said. “Organizations need to be able to wipe corporate information if a device gets lost or stolen, for example.”

While security is of utmost importance, it’s also essential for businesses implementing a BYOD initiative to reassure employees that the company won’t use its connections to access personal information stored on employees’ devices. According to Gartner’s “Predicts 2014: Mobile and Wireless” report, one in five BYOD initiatives will be considered a failure by 2016 due to privacy concerns. “An intended benefit of BYOD is that employees only need to carry one device,” Schofield said. “But we’ve found that many employees continue to carry two because they are afraid that the company may not only access their personal information, but lose or delete it.”


Finding the right balance between security and flexibility is essential to success. “Security of company data and employee privacy concerns are valid issues, but these are both solvable with the right policies and tools,” Gartner’s Willis said. “The key is that the policy clearly states the rights of both employee and employer, and that the employer holds itself to the standard the policy sets. If an employee doesn’t want to conform to the policy then they should have the option to opt out, at which point the employer might choose to provide a device if it is essential to the job.”

BYOD is one practice where a few walls are well advised, Lane said. “Our approach is to implement software via a mobile app, which effectively creates a walled garden of access for company information. Employees enter a password to access the work environment, which is controlled and managed by the enterprise. Everything outside of that is personal – the company has no access to it. Having this clear-cut division is imperative.”


Overall, Citrix’s Irwin believes there’s only one route to take when it comes to BYOD. “Look around you and see just how many employees bring their devices into work – it’s happening everywhere, in almost every company,” he said. “Companies may think they can avoid BYOD, but the fact is that they’re just putting their company at risk if they do. There’s no getting away from it – it’s imperative to today’s business environment, and those that embrace it and use it to their advantage will reap real rewards.”



In fact, Willis suggests that forward-thinking organizations will use BYOD to differentiate themselves and reach new levels of efficiency. “There’s an opportunity to add new mobile applications which can be used by all employees,” he said. “For example, collaboration tools like secure file sync and share capabilities will make groups more productive. Today, people are using their mobile devices in business far more than they ever did in the PC era. It’s just beginning.”

However, Schofield isn’t so confident. He believes organizations need to tread carefully to avoid losing money. “If the consumerization of enterprises for wireless services continues, we see a race to the bottom,” he said. “Carriers make their margins selling full-price devices upfront to subsidize lower long-term monthly charges. Should the device break, the carrier doubles its margins, because without a device there is no communication.” ◆

Managing by metrics

Information management becomes mining’s greatest tool

Dan Headrick

3 min read

Even the world’s most technologically sophisticated mining companies are coming to understand that people, equipped with the right information, hold the keys to success.

In Nunavut, Canada, about 2,600 kilometers (1,616 miles) northwest of Toronto, a massive hauling truck broke down recently at the Meadowbank gold mine, threatening to cut production by 5%, which could have cost the company millions of dollars.

But the mining company, Agnico Eagle, had recently adopted new mine planning and ore production software.

Instead of spending several weeks planning with old tools, the company simulated various life-of-mine plans, tested their impacts on production and shifted operations in just days.

“Having tools like these provides a smooth workflow that allows you to look at various options and evaluate several differences in designs,” said Eric Ramsay, senior mine geologist with Agnico Eagle.


A growing chorus of industry leaders is calling for more mining companies to invest in lean business processes, as Agnico Eagle has. The industry’s future, these leaders say, depends on how well mining companies coordinate dynamic information across complex operations.

“Lean” refers to a manufacturing process Toyota refined in the 1970s and ’80s and that spread worldwide to transform modern manufacturing operations. Lean embraces simple concepts that never stop evolving, largely because they depend on every employee having just the right information at just the right time to function at peak efficiency.

“It is important for mining companies and practitioners of lean to make a clear distinction between lean thinking and automation,” said Paul Smith, director of Shinka Management, an Australian consulting firm that helps mining companies develop lean business practices. “Lean manufacturing has traditionally shunned expensive, high-tech solutions to process management, instead opting for low- or no-cost improvements wherever possible.”


Mining is a volatile business that traditionally operates on boom-or-bust cycles subject to the swings of nature, local politics and share prices. New technology improves production, but industry experts agree it’s not enough. Not just operational practices, but also business models, must be lean to eliminate waste, smooth out the peaks and valleys of production, avoid accidents and adjust to the always-present unforeseen.

“Open pit is easier to automate, but the underground mine is like a mini-city,” said Mike MacFarlane, a 35-year mining industry veteran who, as a consultant, urges companies to think differently. “Every day the road changes; all kinds of complex decisions have to be made along the way. The biggest lever point is engagement with employees.”



Therein lies the opportunity, and it’s not just earth-moving operations that must be managed. Governments and communities must also be engaged. In 2014, in a presentation to socially responsible investment analysts, Mark Cutifani, CEO of London-based Anglo American, one of the world’s largest mining companies, said that “approximately US$25 billion of mining projects are delayed or on hold due to sustainability and stakeholder issues.”


Cutifani co-chairs a working group of the Kellogg Innovation Network (KIN) that is trying to change the way mining companies think about how their businesses operate in a larger ecosystem of ecological, business and cultural interests.

“The industry is constrained by conventional thinking,” Cutifani said in a white paper that came out of KIN deliberations. “We need to do this outside of the normal industry structure by drawing ideas and experience from outside of the industry. We want to lead the industry and set the pace for change.”

Some companies are already working to retool their business models. British-Australian multinational metals and mining giant Rio Tinto Group, for example, launched an industry-first operation at its new Analytics Excellence Centre in Pune, India, to analyze massive volumes of research, productivity and sensor-collected data from the company’s fixed and mobile equipment, aiming to predict and prevent downtime and improve safety and productivity.

“The centre will help us pinpoint with incredible accuracy the operating performance of our equipment,” said Greg Lilleyman, Rio Tinto’s chief executive of technology and innovation. Rio Tinto is headed by Sam Walsh, a former auto industry executive and passionate lean business proponent who is focused on mining big data to squeeze greater profits from the business and using computer simulations to predict and plan its future.


At the Meadowbank open-pit gold mine in northern Canada, which has recorded 1.2 million ounces of proven and probable reserves since it opened in 2010, Agnico Eagle is demonstrating Walsh’s point. The mine has saved US$1.4 million and increased output by 70,000 troy ounces in recent months, just by managing information more smoothly to achieve efficient dump design.

It’s exactly the kind of outcome Taiichi Ohno, the Japanese industrial engineer and father of lean manufacturing, envisioned in the late 1940s as he observed that communicating information correctly was key to the smooth operation of modern supermarkets in the United States.

Ohno took his ideas back to Toyota and changed the world. Armed with the same concepts and new information tools, the mining industry today has a similar opportunity to transform itself for the future.


Open innovation

Ideas from outside an organization can reinvigorate business innovation

Allan Behrens

4 min read

Like individuals, companies can develop tried-and-true ways of tackling challenges and evaluating opportunities. No matter how successful, these “standard operating procedures” also come with blinders to new ideas and even better ways of working. Involving outsiders – a process known as “open innovation” – can help to reinvigorate innovation, reduce risk and increase speed in identifying and achieving solutions.

Two heads are better than one. It’s a well-known concept that explains why brainstorming sessions are so popular in modern business. Apply more minds to a problem and you’re more likely to find an innovative answer to a tough challenge.

Open Innovation, a concept first described by Henry Chesbrough, adjunct professor at the University of California Berkeley’s Haas School of Business, in his 2003 book Open Innovation: The New Imperative for Creating and Profiting from Technology, applies the same idea not just to individuals, but to companies as well. Also known as external or networked innovation, open innovation allows companies to move from a closed innovation process involving only internal resources to one that innovates by involving external resources.

“Open innovation is dramatically changing today’s business world,” said Sandy Carter, general manager for IBM Ecosystems and Social Business Evangelism. “It’s not just for our customers, but also for us, IBM, internally.”


Supplementing the brainpower available inside one company with input from partners, customers, consultants, government laboratories, crowdsourcing and more exponentially expands a company’s options for finding a truly innovative solution. Expanding their available brainpower is one major reason that companies acquire other companies and form strategic partnerships – to bring new ideas and new ways of working into an established culture.

“Companies can no longer afford to do it on their own,” Martin Curley, vice president and director of Intel Labs Europe, wrote in “The Evolution of Open Innovation,” his “Letter from Industry” to the Journal of Innovation Management. “Indeed the unit of competition is changing in that it is often no longer about how good an individual company or organization is but the strength of the ecosystem in which they participate is often the differentiating factor for great success, mediocrity or even failure.”

But why has a 12-year-old idea become so relevant today? One reason is the changing nature of competition.

“The historical, kaizen-like methods of iterating innovation simply aren’t enough to defend today’s business from the radical thinking and clearly differentiating new entrant,” said Gary Barnett, CTO of UK-based not-for-profit environmental startup AirSensa.



A second reason: The advent of social networks (both internal and external), social-listening tools, dashboards, unstructured data search, crowdsourcing, big-data analytics and the like have made it faster and easier to solicit, collect, analyze and act on inputs from partners, customers and the general public. Armed with these tools, open innovation with almost anyone, anywhere has become both possible and affordable.


Social media, in particular, has thrown the channels of communication wide open, making it easier than ever to listen to and act on what customers and partners are saying. Used internally, social media-style tools help employees locate in-house experts or float ideas and get feedback from across the company. As more conversations take place, particularly among people who don’t normally interact, innovation bubbles to the surface more easily and more frequently.

Companies need to recognize, however, that these tools only work in a culture that encourages free thinking and free speech. If people think they’re being censored, excluded or judged, they won’t participate. Participation also must be transparent. All feedback has value; no one’s input should be shut down.

One way to jump-start involvement in organizations unaccustomed to open innovation is to organize competitions, such as IBM’s facilitated “hackathons,” Samsung’s Open Innovation Centers, Nokia’s Open Innovation Challenge, Philips’ Innovation Fellows Competition and GE’s Ecomagination Challenge. Maker movements, fab labs, innovation networks, idea hubs and crowdfunding platforms offer additional opportunities to gather inspiration and to encourage employees to stretch their innovation muscles.


Any company that dedicates itself to an open-innovation approach, however, needs to be prepared for certain common stumbling blocks. Professional innovation enablers know this situation very well. “Internal culture, for example, can be a principal challenge in maintaining an innovative environment,” said Martin Duval, president and COO of France-based open innovation services company Bluenove.

Internal innovators and influencers are critical assets, but years of experience, successes and failures can lead to prejudices and resistance. IBM’s Carter tells the story of a non-US company that sent a team to study open innovation in Silicon Valley companies. The team returned from its visit, brimming with ideas for improvement – and met a stone wall of resistance. A year later, frustrated by obstacles and disinterest in their home organization, every single member of the team had resigned.

“We need to listen to everyone that’ll talk and talk to everyone that’ll listen,” Barnett said. Listening to an organization’s youngest members is especially critical. “The young help us avoid falling into the trap of always listening to those that are as wrong as we are.” Generation Y, the most tech-savvy generation in history, often sees possibilities that experienced experts might not consider – the types of possibilities that enable radical change.

Ultimately, the best way to create a culture of innovation is to be open to the new ideas of open engagement, as well as the platforms and methods that enable them. Modern technologies make open innovation easier than ever before. What is most important, however, is for forward-thinking businesses to look at opportunities with fresh, open eyes.

Allan Behrens is the managing director at Taxal, which provides analysis, advisory, research and business services for IT users, product and service companies. He is on the board of the UK’s Manufacturing Services Thought Leadership Network and has served as a past chairman of the Engineering Solutions Forum and vice chairman of the Computer Suppliers Federation, encouraging more profitable exploitation of technology within European manufacturing.

Infrastructure innovation

Collaborative and efficient design platform simplifies civil engineering projects

Nick Lerner

3 min read

To flourish, growing populations need more and better infrastructure – the roads, bridges and other public facilities created for government agencies by civil engineers and construction companies. Costly overruns are typical in such projects, but experts agree that many challenges can be overcome through enhanced stakeholder collaboration.

By providing societies with infrastructure, including water, transport, communications, energy and waste systems, civil engineering projects help communities to function, develop and grow. But much of the world’s infrastructure is inadequate and crumbling, and growing populations will only need more of it.

According to McKinsey, a New York-based consulting firm, global investment in infrastructure needs to increase by 60%, an increase of US$57 trillion worldwide, between now and 2030, just to support projected economic growth. But more spending is not the total answer; McKinsey estimates that improved civil engineering productivity could save US$1 trillion annually, putting pressure on the civil engineering industry to embrace efficiency.

“When compared with constructing individual buildings or developments, large-scale civil engineering projects that underpin national economies are uniquely complex,” said Mark Hansford, editor of New Civil Engineer, a magazine for civil engineering professionals. “And because projects are taxpayer funded, they are often under rigorous public scrutiny.”

The Institution of Civil Engineers, an international professional association based in London, offers another reason for increased scrutiny: Declining public financing for civil engineering projects will require that private money supplement public investment, and private backers are skilled at squeezing every penny of waste out of a project. As a result, increased demand for on-time, on-budget projects, coupled with a new focus on profits, is forcing the civil engineering industry to quickly adapt.


Inefficiency is common in civil engineering projects because “outdated and fragmented business structures, leading to a lack of knowledge sharing among professionals and the public, compound problems on many large and complex civil engineering projects,” said Andrew Hawes of Hawes Associates, a UK-based flood defense consultant to national and regional governments.

Hawes is responsible for sea defense levees protecting billions of dollars’ worth of land, crops, businesses and homes from flooding. “On projects of this importance, with complexity, scale and risk necessarily high, contractors and stakeholders need a unified, real-time expert knowledge platform for design, engineering and operations information,” he said.

While such platforms are common in manufacturing, Hawes said, they are rarely used in civil engineering. “New collaborative commercial models would help reduce waste and rework by integrating and unifying on- and off-site project teams and fabricators,” he said.


Multidiscipline, collaborative platforms also can help civil engineering projects attract much-needed capital, according to Matthieu Favas, editor of a publication that reports on the flow of private capital into infrastructure projects. “Investors demand openness and transparency,” Favas said. “They know they can reduce risk by ensuring that project information is available to all parties in formats that serve them best.”

Superior information, Favas said, also can provide an important competitive advantage. “Investors want to see rapid completion so they can beat competitors to revenue streams. They encourage knowledge sharing to achieve collaboration with governments, contractors and the public.”

In fast-growing China, efficient infrastructure creation is a major focus not because of pressure from investors, but due to the need for speed, said Lv Wei Zhang, association chief engineer of the Shanghai Municipal Engineering Design Institute’s (SMEDI) IT Center. “One of the factors that accompanies China’s rapid development is the adoption of a ‘three parallels’ practice,” Zhang said. “This means that the three processes of planning, design and construction are operated in parallel so that, halfway through the design process, the construction phase is initiated.”

The only way to manage a project that is changing as it is being built, Zhang said, is with sophisticated management systems. “With our civil engineering BIM solution, it has become much easier for us to accommodate changes in design, which can be frequent, by using knowledge- and template-based design techniques. Now, the pain of endless modifications is significantly reduced.”


In Saudi Arabia, where Newtecnic, a UK-based building engineering company, is engineering the façade design of the new, Zaha Hadid-designed metro station in the King Abdullah Financial District, collaboration drives the need for sophisticated civil engineering systems. The multibillion-dollar transport hub at the business heart of Riyadh will serve as a major interchange for the city’s six new metro lines and provide access to the city’s monorail network, as well as to the King Khalid International Airport and outlying universities.



The project uses a shared information system to align objectives among diverse groups. “It ensures that all parties are connected to, and can simultaneously collaborate around, current designs and associated information in meaningful ways,” said Andrew Watts, Newtecnic’s founder.

The next logical step for large infrastructure projects, Watts said, is “deploying technology that reduces fragmentation, unifies documentation and safely speeds up design and construction by introducing concurrent, flexible and collaborative working methods. Turning data into accessible knowledge delivers better operational efficiency and financial returns for civil engineering and infrastructure investors and stakeholders while enabling enhanced economic and social benefits for everyone.”

Personal shopping

New technologies are giving consumers more customized experiences

Jacqui Griffiths

7 min read

Consumers expect customized products and services that reflect their individual identities – and brands are harnessing innovative technologies to deliver them.

Today’s consumers are connected to each other and to their favorite brands more closely than ever before. Everything, from their smartphone apps and social media activities to the way they shop, reflects a unique blend of choices and preferences. In this connected world, a customized experience is no longer something to aspire to – it’s essential.

“Mass customization is a trend we see developing globally, with advancements coming from all world regions,” said Stacy Glasgow, consumer trends analyst at Mintel, a market intelligence agency with offices in major cities worldwide. “It is driven by the complex, multifaceted self-identities that prevail today; consumers have come to demand products and services that fit their unique personalities and different needs.”

Leading brands are using innovative technologies to deliver on the customization expectation while boosting customer engagement and building loyalty.


Most consumers are familiar with the capability to customize a product’s appearance, such as choosing the color and embroidery on a pair of sneakers or having a favorite photo printed on a smartphone case. This key trend speaks directly to the consumer’s sense of uniqueness and is helping to drive personalized interactions between consumers and their favorite brands.

Customization of products and packaging is an increasingly popular trend, according to Seth Moser, director of Consumer Goods and Services for the Accenture Customer Innovation Network, part of the international consulting firm Accenture. “Customization,” he said, “is driven by fashion, by marketing and by people’s demands for an experience that makes them feel special. An increasing number of new technologies are enhancing this trend as an in-store experience, as well as online. Customization is one place where we’re starting to see truly seamless retail, where e-commerce and bricks and mortar come together.”



For many consumers, Moser said, e-commerce has become the default choice for transactions. As a result, physical stores have begun to focus more on creating exceptional consumer experiences and offering personal interactions. “Even if you are ordering a product which is being manufactured elsewhere, you might still want to touch and feel some aspect of that product or pick out the colors yourself and put them in front of you in the store,” he said.

Not surprisingly, therefore, brands that have focused on offering customization primarily through their online channels are now bringing these offers into their physical stores, where the ability to see and touch products before buying is driving consumer demand. For example, Glasgow said, “in Brazil, the Esmalte Machine can customize nail polish in 100,000 shades. To meet the demand for customization in that category, the company is launching 60 new franchises this year alone.”


The Adidas Group, an athletic apparel company headquartered in Germany, focuses on delivering a multidimensional in-store shopping experience. Its HomeCourt concept stores include personal-experience elements, such as a digital shoe bar where customers can learn more about products simply by placing a shoe on a table next to a touch screen. Once the screen recognizes the shoe, shoppers can zoom in on a 3D image of the shoe, rotate it and explore the technology inside it to learn how the shoe is designed to maximize performance. If the shoe they’ve chosen proves not to be what they’re looking for, the touch screen’s shoe finder suggests alternate styles to meet their needs.

Adidas also has invested in developing innovative capabilities, such as its “#miZXFLUX” app, which enables consumers to print almost anything – from their favorite photo to an image selected from a themed gallery – onto their Adidas ZX FLUX shoes.



“For today’s consumers, real world and digital world are pretty much intertwined, with mobile devices increasingly becoming their ‘remote control for life,’” Jan Brecht, CIO of Adidas, wrote in a recent blog for Adidas Group. “I think it’s essential for today’s economy, and of course for the sporting goods industry, to cater for changing consumer needs and expectations.”


As brands blend physical and digital channels in the realm of product customization, additional opportunities for personalization, beyond the product’s appearance, become possible. For example, brands are using sophisticated web infrastructures to build customizable experiences beyond their physical products.

“Brands can enable a customized experience by integrating their product with the digital ecosystem that they use to interact with customers,” Moser explained. “This is very scalable, so it makes a lot of sense for lower-margin products.” For instance, he said, “some food companies are starting to use labeling standards to provide a whole new digital level of interaction on top of the product.”

Coca-Cola’s “Share a Coke” campaign became a global phenomenon as consumers in more than 70 countries bought drinks labeled with their names, visited the online store to personalize glass bottles and shared virtual bottles with friends and relatives online. UK firm Appy Food and Drinks, meanwhile, partnered with food processing and packaging company Tetra Pak, headquartered in Switzerland, to include interactive technology on drink cartons. Consumers scanned the cartons with their smartphones to collect props associated with their favorite Nickelodeon cartoon characters. Consumers could then use their collected props to customize digital photos.


Emerging manufacturing technologies are also enabling increased customization.

“This trend might include a customized food mix for your dog, or sneakers with an orthotic insole manufactured just for your foot,” Moser said. “It has largely been driven by the medical industry where, for example, 3D printing might be used to create a cast that perfectly fits a patient’s arm. It’s been slower to emerge in the consumer goods space because the technology is a little more challenging, but it is likely to have a high impact over the next few years.”

Digital design tools are already enabling some brands to deliver tailored products while keeping costs down. Online apparel store eShakti, for example, with design teams based in New York, London and India, offers a range of designs that shoppers can customize to suit their individual style, size and height. The company says on its website that because it cuts each garment to order, it doesn’t need warehouses, which cuts costs.

The Adidas Group’s HomeCourt concept stores include personal-experience elements such as a digital shoe bar, where customers can get more product information by placing a shoe next to a touch screen. (Image © Adidas Group)

Developments on the horizon will support even more customization. US-based startup Electroloom, for instance, is developing 3D fabric printing technology that will enable users to design and print their own clothes. The technology has already captured the interest of brands and designers around the world.

“Our goal is to use the Electroloom to open the world of fashion design and manufacturing to everyone,” said Aaron Rowley, a biomedical/mechanical engineer who is co-founder of Electroloom. “We hope to enable people to work with Electroloom and do prototyping for design by the beginning of 2016. Eventually, we hope to refine the technology so that it can be used in homes or stores to produce custom items. To provide that sort of experience, we’d build a storefront where people can submit their own designs or select from a range of custom templates.”


By putting the consumer at the center of the brand experience, mass customization emphasizes the importance of customized service and products throughout the shopping journey. Technologies that are aware of the context in which they are being used, as well as mobile interaction technologies – including Apple’s iBeacon for location-based services, near-field communication, electronic product codes and biometric devices – are enabling more interactivity than ever before, and brands are leveraging them to create truly customized shopping experiences.

“New technologies and approaches that brands are pioneering today will further solidify consumers’ expectations for customization,” Mintel’s Glasgow said. “In store, for example, retailers are using systems which can read shoppers’ emotions and provide deals based on real-time feelings. In the online/mobile space, retailers are using social media to better fulfill the needs of individuals, with initiatives such as using Facebook ‘likes’ to decide what goes on shelves and targeting Instagram users with personalized outfit recommendations based on their previously posted photos.”



Key examples include:
• Rebecca Minkoff’s flagship store in New York City, where shoppers can customize their perfect look with mood lighting in the dressing rooms and interactive touch-screen mirrors. The mirrors allow shoppers to switch sizes and colors in an instant, request recommendations based on other styles they have selected, or save the fitting session for a later date.
• Russia’s Synqera, which has created a cashier platform called Simplate that uses facial recognition and expression technology to offer discounts and shopping suggestions based on a customer’s identity and body language.


Brands need to think carefully about the level of personalization shoppers really want, however. In its recent study “Creepy or Cool,” global omnichannel personalization firm RichRelevance observed that while shoppers want digital personalization when they are ready to make a purchase, they may see some interactions – such as being greeted by name when they enter the store – as intrusive.

“You need to be ingesting the right information, from the store as well as the stream data that is automatically generated by websites,” said Matthieu Chouard, vice president and general manager for EMEA at RichRelevance.

Chouard cites Monsoon Accessorize, a UK-based retailer with more than 1,000 stores in 74 countries, as having an industry-leading vision for personalized shopping experiences. In 2013, the company began using in-store tablets to provide assisted selling and payment. Monsoon Accessorize also employs machine learning algorithms and a real-time decision engine with digital receipt technology to identify cross-channel buying behavior and provide highly targeted product recommendations and offers via emailed receipts.

“The more data you have, the better insight you can get from it and the more accurate you can be in terms of service delivery to the customer – from identifying products they might like to buy and knowing whether those products are in stock, to anticipating the type of service or interaction they would like to have with the retailer,” Chouard said.

Ultimately, mass customization promises to transform the products we buy and the way we buy them.

“We’re seeing constant developments in terms of what can be done, but there is still untapped potential,” Glasgow said. “Goods and services that can be continually customized will win with consumer tastes; smart technology can be employed to learn consumer habits for seamless customization; and in an increasingly digital world, there’s refreshing opportunity to enhance tactile customization experiences.”

The shipping survival guide

Juggling efficiency with bigger container ships

Greg Trauthwein

7 min read

As the prices paid to move cargo over long distances have plummeted, shippers have turned to ever-larger ships in search of volume-driven savings. But that strategy is putting a massive financial burden on the ports needed to service them, creating friction between shipping interests that depend on one another to survive. Both sides are looking to information technology improvements for relief.

Shippers of containerized cargo know exactly how to lower their costs: move more cargo per ship. As a result, the industry is scheduled to welcome its first 20,000 TEU (twenty-foot equivalent unit) ship in 2018 – a leviathan so big that few ports in the world will be able to service it.

For shippers, the motivation for bigger ships is a simple question of economics. “The cost of moving a box from Hong Kong to New York in 1973 was about US$5,800 per box; today it is about US$2,500 per box,” said Richard Larrabee, former director of port commerce for The Port Authority of New York and New Jersey, a decline made possible by rapidly escalating volumes of cargo moving on larger and larger ships.

Growth in demand for containerized shipping is historically strong, projected to increase by 6% to 7% this year, according to RS Platou Economic Research, a division of the Oslo, Norway-based RS Platou Group, a leading international broker and investment bank serving the offshore and shipping markets. Meanwhile, Tokyo-based International Association of Ports and Harbors (IAPH), a nonprofit organization, reports that global cargo levels in 2013 reached more than 651 million TEUs. While the growth rate in demand for space is respectable, ship capacity grows in tandem, with an estimated 1.6 million to 1.9 million TEU of new shipping capacity scheduled to hit the water in 2015, a net fleet expansion of nearly 7%.



But “container shipping is a low-margin industry,” said Keith Svendsen, vice president, Operations Execution, Maersk Line, the global container division of the A.P. Møller-Mærsk Gruppen, a Danish conglomerate. “In 2014, only Maersk Line and three other lines reported significant positive earnings before interest and tax margin. The industry’s margin as a whole averaged 3.1%. Profitability is, to a high degree, dependent on the industry’s ability to lower costs and increase efficiency. The main lever is economies of scale, illustrated by the increase in vessel size.”


As ships grow bigger, the channels, ports, facilities and associated logistics services must grow as well, a reality that is creating friction between shippers and the facilities that serve them.

“I believe the defining trend is the owners’ pursuit of optimum ship size, which appears to be toward bigger carrying capacities and lower unit costs,” said Shashi Kumar, professor of International Business and Logistics and academic dean at the United States Merchant Marine Academy (USMMA).* “Everything else is driven by this: the widening of the Panama Canal and improvements to the Suez Canal; new investments in ports and terminals; dredging deeper channels; and even the motivation (for carriers) to form alliances.”

Port facilities and terminal operators complain that ship owners have forced them to bear the brunt of the burden to invest, without a sufficient increase in volume to offset the cost.

“These ultra large container vessels put us, as a port operator, under pressure,” said Mohammed Al Muallem, senior vice president and managing director, UAE Region, DP World, a global container handling specialist with more than 65 marine terminals in Dubai, one of the seven United Arab Emirates. “On the one hand, they help shipping lines to reduce unit costs. But they also require the port industry to invest in longer berths, deeper drafts and bigger cranes to translate on-water economies of scale to land.” DP World has invested more than US$6 billion over the past five years, Al Muallem said, adding capacity and upgrading infrastructure of its terminals, including its flagship Jebel Ali facility in Dubai.



The story is the same in New York, which has invested approximately US$5 billion over the past decade on dredging channels from 40-foot depths to depths of 50 feet, along with other port and facility improvements. Such investments must be made on faith, the Port Authority’s Larrabee said. “Digging a deeper channel, raising the Bayonne Bridge … it doesn’t necessarily mean that more cargo is going to come.”

Naturally, shipowners see the equation somewhat differently. “High-quality infrastructure is an important determinant for economic growth,” Svendsen said. “While shipping lines invest to make global trade more efficient, governments and ports are not following the same path. More collaboration between shipping lines and terminals is key to improving efficiency going forward.”

Shipping lines that want to find berths for their new mega-ships don’t expect ports to do all of the work on their own, Maersk’s Svendsen said. For example, Maersk helped ports prepare in advance of its world-record-setting Triple-E vessels, where “Triple-E” stands for efficiency, economy of scale and environment. “Maersk Line worked with ports in Asia and Europe to assist with the preparation for larger ships,” Svendsen said. “Such preparations included infrastructure adjustments (water draft, length/width of quays or turning basins, crane upgrades) and pilot training, for which Maersk Line provided data for simulator use.”


Movement of cargo via container ship is a delicate global dance; disruptions caused by severe weather conditions, labor disputes or political unrest can bring the machinery to a grinding halt. For example, the nine-month port strikes on the US West Coast in 2014 and early 2015 gave the industry a multibillion dollar economic wallop. Store shelves were left empty, employees were laid off, and dozens of ships sat at anchor off West Coast ports for weeks as produce rotted and orders went unfilled. The stoppage forced shippers to redraw well-established trade patterns, permanently moving some Asia-to-US cargo from the geographically closer and more cost-effective West Coast to US Gulf and East Coast ports.

Weather, labor and politics are all unknown variables. What is known is the undeniable increase in the size and complexity of ships, ports and the logistics chain, which creates equally large and difficult scenarios to optimize system efficiency. The US West Coast situation convinced many experts that bigger ports are not the sole solution to bigger ship sizes; increasing the flow of cargo into fewer, larger facilities simply created bigger ripples throughout the logistics chain.

“(Bigger ships), in terms of fuel and cargo efficiency, seem to be very effective,” said Marco Pluijm, ports and marine sector manager for Bechtel, one of the world’s largest engineering, construction and project management companies. “However, at the same time, it puts stress on the port and terminal owners and operators who have to handle these bigger ship sizes. The top three impediments (to efficiency) are port and terminal handling capabilities, hinterland connections and the availability of suitable financial means.”

For example, Pluijm said, major ports on the US East Coast, which have been upgrading to prepare for the expanded Panama Canal (the new Panamax ships will haul as much as 12,500 TEU) coming into operation in 2016/17, now need even more investment to accommodate the 20,000 TEU vessels that began operating when the expanded Suez Canal opened in August 2015. “Not each port will be able to do so,” he said.


Some of these impediments, the USMMA’s Kumar said, can be eased with better information delivered earlier.

“Central to all of this is the speed of the information from the ship to the terminal,” he said. “Without IT support in its myriad forms there would be utter chaos, from port and shipping operations to cargo clearance to forecasting and data analytics. There can be no supply chain efficiency without real-time information, and that’s only possible because of all the advances we continue to make in IT.”

Regarding ships at sea, Maersk’s IT investment is best exemplified by, but not exclusive to, its investment in the Triple-E class, Svendsen said. “We are increasingly connecting the fleet with the shore organization to provide guidance in real time on optimal routes, port stays and speed. This has allowed Maersk Line to realize significant savings.”

Guinness cargo record – Maersk (Image © Maersk Line)

Like shippers, ports are investing in IT as well. “We are constantly investing in our operations systems, procedures and software,” DP World’s Al Muallem said. “We have a strong IT team developing software tools that can help make the entire port more efficient. Getting the containers off the ship is just the first step in a multistage process from ship to evacuation from the port.”

DP World has dubbed its Jebel Ali Container Terminal 3 its “terminal of the future.” “T3” features 19 of the world’s largest and most modern quay cranes, remotely operated from a quayside control room. The cranes – along with smart mobile applications, a 100% electronic transaction facility and an advanced Gate Automation system – are all designed to contribute to a cost-effective terminal.

In New York, meanwhile, the port’s most visible IT investment is the new Terminal Information Portal System (TIPS) from Sustainable Terminal Services. TIPS provides all port users with comprehensive web-based access to centralized information on container availability and booking status, as well as terminal-specific news. The power is in the consolidation of information.

While much headway has been made, more work lies ahead. Kumar, for one, believes port and terminal operators can learn how to handle the world’s largest ships by studying how airports handle the world’s largest airplanes.

“When airlines started using the (Airbus) A380s (which can carry as many as 853 passengers), the embarking and disembarking of passengers were also speeded up,” he said. “No additional time is required in airports to handle the bigger planes, unlike what is going on with the bigger container ships. We need similar paradigm changes in port operations to keep up with increasing ship sizes. There has to be increased automation and even greater use of information technology to make this happen.”

* The views expressed in this article by Shashi Kumar are his own and not those of the United States Merchant Marine Academy, the Maritime Administration, the US Department of Transportation or the US government.


While much attention is paid to the size of ships, the depths of harbors and the onshore capabilities to move containers through the system, Bechtel’s Marco Pluijm, ports and marine infrastructure sector manager, is a proponent of building man-made offshore ports to ease congestion and increase efficiency of container shipping operations.

In the Bechtel model, deep sea vessels discharge their cargo via the offshore hub. From there, it is transferred to barges, which transport the cargo to one or more nearby ports. Distributing the cargo in this way spreads the burden across a wider area, rather than pumping more cargo into already congested corridors. Offshore hubs also reduce the need for deeper dredging of existing ports, reducing the ecological impact that would come with expanding existing ports, Pluijm said.

Making flight fun again

Engineering an improved air travel experience from beginning to end

Tony Velocci

3 min read

From 3D visualization technology that helps engineers predict and refine passenger comfort in business jets, to futuristic entertainment systems on commercial airlines and personalized navigation tools to speed passengers to their gates at airports, the air travel industry is scrambling to improve passenger experience.

When US-based Gulfstream Aerospace was developing its all-new G500 and G600 business jets, one of its goals was to optimize the aircraft to deliver a unique combination of speed, range, the most advanced cockpit available anywhere and a superior cabin experience.

To help verify that they were creating the best blend of form, function and efficiency, engineers used Gulfstream’s cave automatic virtual environment (CAVE) to conduct design reviews and support systems integration.

CAVE uses a high-definition active stereoscopic projector to display fully immersive 3D images on the walls of a room-sized cube. At different stages in the airplane’s development, engineers use head-mounted 3D glasses equipped with sensors for motion tracking and a control device to interact with life-size projections of 3D models of the aircraft cabin in real time – troubleshooting on the fly, trying various designs, changing fabrics and other materials to experience the overall look that customers have selected for their aircraft.

The immersive experience is generated by real-time 3D visualization software using models created by the aircraft’s design engineering team. Gulfstream’s goal is to drastically reduce the need for expensive physical prototypes as engineers develop what customers had asked for: aircraft with different mission ranges that still embody the elegant Gulfstream experience. The first G500 developed in this way is scheduled for delivery in 2018.

“This is a very powerful, world-class engineering application that is helping us to speed up the design process, but we’re still improving it,” said Fernando Toledo, a Gulfstream Aerospace engineer who specializes in virtual reality simulation. “The ability for us and our customers to be able to experience the product through full immersion in photorealistic three dimensions is invaluable.”



Gulfstream’s CAVE is just one example of how aviation companies — from original equipment manufacturers (OEMs) to airports — are using technology to improve air travelers’ experiences from departure to arrival. The goal: to delight customers, differentiate themselves from the competition and decrease time and cost to innovate.


Innovation that appeals to the customer’s appetite for technical advances and a superior user experience drives ferocious competition among business aircraft manufacturers, with communications and in-flight entertainment systems that replicate or even exceed what consumers enjoy on the ground.

Canada-based Bombardier, for example, is the first business aviation manufacturer to offer the JetWave Ka-Band satellite connectivity system developed by Honeywell Aerospace on some of its long-range models, including Bombardier’s Global 5000 and Global 6000. Starting in 2016, this capability will provide business jet passengers with high-speed, in-flight Internet connectivity virtually anywhere in the world.

“Our customers want to be online everywhere they go,” said Tim Fagan, manager of Industrial Design on the Global 7000 and Global 8000 projects. “Soon they will experience the same level of connectivity in the air that they have come to expect in their home or office.”

Commercial airlines are under similar pressure from their customers to introduce new technologies faster, according to Yann Barbaux, chief innovation officer for Airbus, which builds commercial aircraft for airlines worldwide. As a result, Airbus wants to shorten the cycle time for integrating innovations throughout the aircraft it builds, dropping it from a span of two to three years to a span of only a few months, he said.

To help achieve this goal, the airplane builder is increasing its use of both automation and robotics. In addition, Airbus, supported by its technology vendors, is exploiting the use of advanced software to help airline customers improve flight operations – including which type of aircraft will provide the most economical service on particular routes. “Digital technologies are evolving ever faster, and we intend to lead this evolution,” Barbaux said.

British Airways also intends to be a pacesetter. The commercial carrier effectively reinvented its first-class cabin in the new Boeing 787-9 Dreamliner it operates, with individual suites that allow passengers to experience gate-to-gate entertainment without having to stow their television for takeoff and landing. In-flight entertainment is controlled using a new handset similar to a smartphone that is integrated into the seat. In addition, passengers can dock the handset, allowing them to use one menu option on the handset, such as the moving map, while watching something else, such as a movie, on the 23-inch fixed screen. Discreet stowage areas adjacent to each armrest conceal convenient power outlets.


On the ground, airport connectivity is set for a boost in the fall of 2015, when a joint task force of airports, airlines and SITA, a specialist in air transport communications and information technology, is scheduled to publish a set of standards and governance rules for how wireless technology should be used at airport terminals.

When a passenger is in the vicinity of an iBeacon — one of the compact, battery-powered Bluetooth transmitters placed in strategic locations throughout the terminal — an airline or airport-developed application on his or her mobile phone “wakes up.” At London’s Gatwick Airport, beacons identify travelers by their smartphones and provide GPS-style directions to their gates, pointing out food or shopping along the way.

Some airports, including Dulles International near Washington, DC, use facial recognition systems to speed passengers through passport control. The objective: to make airports more hospitable, speed up travel and enable air travelers to interact with their environment and — as unlikely as it may sound — make air travel enjoyable again.
