Damage-free deliveries

Packaging simulation technology helps to reduce shipping damage

Lindsay James
30 July 2018

4 min read

With the surge in e-commerce, consumer packaged goods companies are shipping more direct to consumers – and that means packaging must be designed to prevent damage and facilitate easy returns. Simulation technologies are helping CPG companies deliver better customer experiences.

The rise of e-commerce has changed the way people shop – and the way brands sell. Instead of offering their wares almost exclusively in stores, consumer packaged goods (CPG) companies are increasingly shipping direct to consumer (DTC).

However, the DTC model routinely fails in one area: packaging.

“Traditionally CPG companies designed packaging solutions with the retail shelf in mind,” said Eric Hiser, technical vice president at Michigan-based International Safe Transit Association (ISTA), which develops testing protocols and design standards for packaging. “Products are packaged in bulk and put on a pallet – they have ‘strength in numbers’ and there’s relatively few touchpoints along the distribution chain. But this all changes in a DTC scenario.”


The DTC model involves four times as many transfers or “touchpoints,” which increase the risk of damage. “Whereas traditional retail relied on an average of five touchpoints, e-commerce tends to involve 20 or more,” said Kyla Fisher, a sustainability analyst and program manager at The American Institute for Packaging (AMERIPEN), headquartered in Minnesota.

“In traditional retail, pallets of goods go from the manufacturer direct to the retail distribution center, where they would be unloaded and then reloaded onto smaller transport trucks for delivery to retail,” Fisher said. In e-commerce scenarios, however, shipments go from the manufacturer to a delivery hub, where they are unpackaged and stored. When orders arrive, they are repackaged, loaded onto small trucks, and then unloaded and reloaded at multiple hubs until they reach the recipient.

Returns multiply touches even more, and e-commerce involves a high rate of return – 30% versus 9% in the brick-and-mortar world, according to the 2017 AMERIPEN whitepaper “Optimizing Packaging for an E-commerce World.”

Fisher believes that damage rates are closely linked to packaging failures. “Not having enough packaging material has significantly increased the risk for damage,” she said. “But consumers are also equally as frustrated by overpackaged products.”

“Whereas traditional retail relied on an average of five touchpoints, e-commerce tends to involve 20 or more touchpoints”


Damaged deliveries eat into companies’ revenues and erode their brand reputations.

“The cost of replacing a destroyed item can be up to 17 times the cost of shipping, and negative website reviews resulting from the destruction of an item can take months to counterbalance with positive ones,” the Virginia-based Association for Packaging and Processing Technologies (PMMI) observed in its most recent “E-Commerce Market Assessment.”


It’s clear that something needs to change. “For e-commerce situations, the entire package needs to be redesigned and re-evaluated from a distribution and end-use point of view,” said Sumit Mukherjee, vice president of Advanced Engineering Services at Ohio-based Plastic Technologies, a designer and manufacturer of plastic packaging.

The answer, experts agree, lies in packaging designed and tested for heavy handling. Until recently, most packaging was tested by building and moving physical prototypes – a time- and cost-intensive process. Computerized simulation now does the job better, faster and less expensively.

“For us, this involves testing a small number of containers in controlled and consistent environments following defined procedures,” said Hansong Huang, head of Advanced Engineering at Amcor Rigid Plastics, a packaging solutions company based in Detroit. “However, the product validation is defined entirely on the behavior of tens of thousands of containers in their filling lines, supply chain and customers’ hands in pilot production, sometimes done by the customers of our customer.”

The problem with that process, Huang said, is that “the validation phase, in most cases, is not in our control, has large variations and is very costly and time consuming if something goes wrong.”


The cost of replacing an item destroyed by shipping can be up to 17 times the cost of shipping

Recent advances in computerized simulation and testing technology remove the guesswork.

“Simulation techniques will help us answer the ‘how?’ and ‘how much?’ questions in a scientifically iterative and innovative process,” Mukherjee said. “Identifying a robust design, coupled with accurate material properties, will allow simulation of the different scenarios of transportation with possible exposure to high temperatures and pressures. The entire solution space can be skimmed for optimal points, where best performance can be had.”
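The “solution space” sweep Mukherjee describes can be illustrated with a toy example: vary two design variables, score each candidate with a simulated damage model, and keep the cheapest design that meets a damage-rate target. The formulas and thresholds below are illustrative assumptions, not real test data or any vendor’s actual simulation tooling.

```python
# Toy sweep of a packaging "solution space": vary wall thickness and cushioning,
# score each design with stand-in damage and cost models, and keep the cheapest
# design that meets a damage-rate target. All formulas are hypothetical.

def damage_risk(wall_mm: float, cushion_mm: float, touchpoints: int = 20) -> float:
    """Hypothetical probability of damage across `touchpoints` handling events."""
    per_touch = 0.03 / (1.0 + 0.8 * wall_mm + 0.5 * cushion_mm) ** 2
    return 1.0 - (1.0 - per_touch) ** touchpoints

def material_cost(wall_mm: float, cushion_mm: float) -> float:
    """Hypothetical unit cost, rising with the material used."""
    return 0.40 + 0.25 * wall_mm + 0.10 * cushion_mm

best = None
for wall in [0.5, 1.0, 1.5, 2.0, 2.5]:       # wall thickness, mm
    for cushion in [0.0, 2.0, 4.0, 6.0]:     # cushioning, mm
        risk = damage_risk(wall, cushion)
        if risk <= 0.02:                     # target: at most 2% damaged shipments
            cost = material_cost(wall, cushion)
            if best is None or cost < best[0]:
                best = (cost, wall, cushion, risk)

if best is not None:
    cost, wall, cushion, risk = best
    print(f"wall={wall}mm cushion={cushion}mm cost=${cost:.2f} risk={risk:.1%}")
```

In a real workflow the inner damage model would be a finite-element or drop-test simulation rather than a formula, but the outer loop – sweep the design variables, filter on performance, optimize on cost – is the same.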

Amcor is already leveraging the technology to great effect.

“By building simulation specifically for e-commerce, we can rapidly gain understanding of the physics and actual requirements for the containers to pass,” Huang said. “We already use simulation to design for robustness and testing extreme conditions that are difficult to replicate reliably in physical testing. I expect it to play an even more important role for e-commerce situations in the years ahead.”


The benefits are clear. “By using simulation to lead the design and testing method development, instead of developing simulation to replace existing testing as in the past, advantage is gained through not only shortening the timeline but also developing better methodologies at the end that can be widely applied,” Huang said. “At Amcor, we have already built and validated the technical foundations and most of the components needed for accurate simulation for e-commerce requirements – the thickness distribution prediction, material properties, numerical methods, verification and validation, workflow, etc. We now need to extend this to include a full set of new loading conditions and, perhaps more importantly, adapt our methodology to live up to a new level of uncertainty in the new journey.”


Hiser believes that in the future this progress will enable CPG companies to more rapidly adopt new innovative delivery methods. “They can take existing packaging designs and simulate their performance using new channels such as curbside pickup, drones, robots…all manner of future scenarios,” he said.

“The benefits will be huge,” Mukherjee added. “Firms that do not take advantage of these digital tools will ultimately be left behind.”

For information on designing the Perfect Package, please visit:

Smart cities

How digitalization helps rapidly growing urban areas remain livable

6 min read

Antoine Picon is a French engineer, architect and historian who has published 20 books about urban planning and architecture, taught at the École Nationale des Ponts et Chaussées and the Harvard Graduate School of Design, and been honored as one of the 2014 Mellon Senior Fellows of the Canadian Centre for Architecture (CCA). Compass spoke to him about cities’ growing challenges.

COMPASS: Throughout the world, population is becoming increasingly concentrated in cities. What challenges does this present, and how are urban planners responding?

ANTOINE PICON: It is indeed a substantive shift which, in some regions of the world, is very quickly gaining speed. Cities are playing an increasingly strategic role in the global economy. In light of this acceleration, there is somewhat of a crisis in traditional planning tools. Cities have become so spread out, so complex, and are evolving so quickly that traditional urban planning solutions no longer meet the needs of planners. The success garnered by the first proposals of digital cities and the global movement toward smart cities are exactly in line with this expectation of new tools for managing cities. Another essential point concerns environmental uncertainty, related to climate change and the proliferation of extreme events.

Many cities are threatened. Populations are attracted to the coasts, and it is especially in these areas that the effects of climate change will be felt the soonest. A third important point is the increased competition between cities that are vying with each other to attract talent, companies and capital. This trend correlates with discussions about the knowledge economy, according to which a city’s competitiveness comes from its ability to attract brainpower, that is, to successfully synthesize academia, research and cutting-edge companies. What’s more, when we speak of cities, we need to speak more specifically about urban metropolises, even though many cities have had difficulty in abandoning the municipal framework. More and more problems, particularly environmental, are emerging at the regional level.

What is the role of digitalization in the recent evolution of cities?

AP: In La ville territoire des cyborgs, I explained in a somewhat provocative way that the cyborg is to the contemporary city what the ideal man was to the regular city imagined by Leonardo da Vinci in the Renaissance. Let me explain.

You have technologies, like the car, which have had a direct impact on the urban form. Digital has very profoundly changed the nature of the urban experience, but has not yet altered its form. Today, all cities are experienced and used in augmented reality: thanks to your smartphone, you are simultaneously in the street and in the digital world, like a kind of cyborg. It is a worldwide phenomenon and is profoundly changing modes of sociability, the way the city is used and even the way we look for a restaurant.

We are no longer the same city dwellers we were during the first industrial revolution or even during the middle of the 20th century. Now, it seems abnormal to us not to know how many minutes we will have to wait for our bus in Paris. Twenty years ago, the bus arrived “within a certain time.”

As for the impact digital has had on the urban form, in my opinion, we are only at the start of the process. Digital has slipped into everything, from dishwashers to cars, from stoplights to Velib bike sharing, but it has not yet profoundly modified the urban form. However, it has completely changed the modes of managing cities. Today, we are able to generate huge amounts of data that make it possible to follow certain aspects of the urban metabolism in real time, and we can cross-check this data in a much finer way than before. This does not mean, however, that all problems have been resolved.

Indeed, it seems that the city itself is not changing a great deal.

AP: We can’t forget that a city is more than a physical reality. A city is made up of the millions of urban experiences and people who live there. The historically constructed framework can be preserved, but we are no longer living in the same inner Paris we used to. Cities have in a way been transfigured by digital; that is, they are lived, understood and experienced in a different way.

The two books I’ve written on smart cities were related to my observation that the theme of smart cities came primarily from the digital industry, while the traditional players in architecture and urban planning had made very little use of it, especially in France. I explained that the smart cities movement would have enormous consequences on the way cities are designed. We are moving from a traditional city of flows to a city of happenings, of events.

Geolocation represents one of the greatest disruptions in the relationship humans have to space. It is somewhat amusing to see to what degree everyone finds this completely natural. The real disruption is not that we no longer have to unfold a map, but that our life takes place on a map. The planet has become a map, a bit like the famous text by Jorge Luis Borges [On Exactitude in Science], on the map of the empire that is the same size as the empire. And this is one of the most fascinating aspects of the digital world, which allows us to zoom in and out continuously, from the planet to our garden and back again.

Antoine Picon, Urban Planning author, historian and planner (Image © CCA)

It also carries many risks. The first is to imagine that reality is made the same way, with a risk of losing landmarks and scales. The information world is a world that spontaneously has no scale, unlike the traditional physical world.

How can digitalization integrate citizens into the urban future?

AP: We now have tools that allow us to consult with citizens, obtain feedback and communicate. This is one of the assets of digital technology. There are still problems, however. Any mayor or technical department will tell you that this ease of communication can prove to be very disruptive!

The problem is that you can make a lot of noise on the Internet without being very numerous in the physical world: a very organized group can mobilize against a project without being especially representative. It’s a problem for urban planning, which used to begin from the principle that people needed to be represented based on their number in reality. On collaborative sites, 10% of the people author 90% of the contributions. This is true for OpenStreetMap, and it was already true in the first online communities. On the Internet, as in many other areas, very few people are behind the majority of the contributions. One of the major challenges in the coming years will be to move past this issue. We will need a new civic education, reconsidered for the digital era.

Furthermore, smart cities cannot be reduced to the matter of technical management because the city remains fundamentally a political reality. Cities are not factories or machines. Even if their operation needs to be optimized, they cannot be reduced to a series of processes to roll out. It is much more complicated, and cities need collective imagination, hope and dreams. They are places of tension and conflict, and imagining that they can be managed in a completely smooth way is, in my opinion, taking a very big risk.

What do you foresee for the development of platforms used for digital exchanges, data management and situational understanding? Is it inevitable?

AP: I think that it is certainly one of the most promising directions to explore today, and this is the reason why digital city projects are developing so quickly. Many social science researchers believe that we are in the midst of moving from a traditional way of thinking about infrastructure to thinking about platforms, and I am fairly convinced that something along these lines is being worked out. Digital platforms make it possible to combine initiatives that are more significant, both quantitatively and qualitatively: more numerous, and in a wider range of fields.

Traditional infrastructure operated based on a very marked distinction between managers, operators and users. Platforms no doubt make it possible to be more flexible and to involve very different players. Platforms open up new possibilities in a world in which traditional roles seem a little too limited. They also enable combining traditional infrastructures and encourage intermodality. They can integrate resources of very different types, which is also one of their strengths.

The people who are connected to platforms are no longer solely users. This relationship can come in a variety of forms. Not everything is completely clear yet, but we are in a period of change. Finally, I am interested in the epistemology of cities and technical systems, and I tell myself that in the transition currently taking place, something fundamental is occurring.

The platform model that facilitates exchanges is beginning to become central, whether in terms of information or physical movement. Are networks therefore essential?

AP: Networks will in any case be managed by platforms; that’s what is happening: a city’s transportation system is integrated into a platform, to a certain degree.

To return to my idea about flows, what I find very striking is that with Uber, in the end, they are managing occurrences, meetings between a vehicle and a user. We are moving beyond the traditional, aggregate figure of the flow, to connect with issues related to collisions between particles.

This is also what is changing the nature of the representation of the network. A traditional network managed flows. Today, we are managing occurrences that are much more atomized or individualized, with much greater granularity. What platforms also allow is to considerably increase an image’s resolution, to zoom in to a certain degree. And on this point, in my opinion, we may still tend to underestimate the revolution this represents. But here again, a very profound transformation is taking place.

Get more information on 3DEXPERIENCity

Capturing knowledge

Product lifecycle management is a potential game-changer in financial services

Miriam Gillinson

3 min read

PowerPoint presentations, Excel spreadsheets and even email are considered increasingly outdated forms of technology in our everyday lives, yet they still play a significant role in product development in the asset management industry. But with competition increasing, profit margins tightening and regulations skyrocketing, the asset management industry is turning to product lifecycle management solutions.

Product lifecycle management (PLM) is a relatively new term within the asset management industry. The discipline tracks and enables the entire product development process – from initial brainstorming to product development, refinement, regulatory review and product launch – and it’s gaining traction in financial services.

Common in the engineering and manufacturing industries, PLM is a long-overdue innovation in financial services, said William Lucken, global head of products at Allianz Global Investors, a global asset management company based in Frankfurt, Germany.

“Asset managers have a bunch of sophisticated tools to allow them to be great portfolio managers,” Lucken said. “But from my experience within the products world, most asset managers don’t demonstrate that same level of sophistication.”

It’s time for asset managers to advance beyond routine tools that create no audit trail, including spreadsheets and email, he said.

“We’re using technology from 30 years ago without questioning it,” he said. “It’s no longer acceptable that we can run our business in that way when the rest of the world and our daily lives [have] transformed from a technological perspective.”


As Abhijit Rawal of PwC observed at a London Business School conference in 2017, the asset management industry has been transformed by the “four R’s”: returns are low, revenues are being squeezed, regulations are being tightened and the robots are coming to gobble up business.



As a result, asset managers are waking up to the importance of integrating new technology into the entire product development process. But it’s not just about updating outdated technological frameworks – it’s also about offering products that will survive and thrive within an increasingly competitive and volatile financial industry.

“The volatility and the changes to the market mean that you might make a mistake in your development process because you don’t have adequate control of the information,” Allianz’s Lucken said. “A PLM platform should allow a company to better record the decisions they make – why we make certain decisions based on particular data – so that we are more in control at every stage.”


Seismic shifts in macroeconomics also have impacted customer needs and expectations. Jaspal Sagger, the Baltimore-based head of international product at global asset management firm Legg Mason, said customers are now looking beyond benchmarks like relative investment performance when it comes to investment strategy.

“Asset managers used to compete on performance alone, earning healthy profits on assets raised in the good times,” he said. “The landscape is very different today – intense competition has squeezed margins, while investors are now savvier about the dangers of chasing returns. Performance will always be important, but we expect a continued shift towards outcome-orientated strategies that address specific client needs, and not only changing economic circumstances but also demographic or regulatory challenges. Clients are looking for partners that go beyond performance.”

Sagger said that distributors are increasingly looking to partner with fewer asset management firms. Therefore, managers with disciplined PLM processes will have far greater control over their future success.

“This is particularly true when PLM is focused on delivering products and vehicles that are well designed, differentiated, fairly priced and specifically focused on clients’ needs,” Sagger said.


In the past, products might have been delivered via PowerPoint presentations and haphazardly archived in email threads.

But a more rigorous approach to PLM – in which every stage of the product development process is tracked and recorded – could save significant time and money, and prevent good ideas from being lost when their champions change jobs.

“Investment themes tend to be fairly cyclical,” Allianz’s Lucken said. “Ideas that were out of fashion 10 years ago might suddenly be revitalized and looked at again. But if you don’t have a tool that records everything you did, the decisions you took and why you took them, then your fate is just to repeat the cycle.”
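The audit trail Lucken describes can be reduced to its essentials: an append-only log that captures each decision together with its rationale, so the “why” survives staff turnover. This sketch is a generic illustration with made-up names, not any specific PLM product’s data model.

```python
# Minimal append-only decision log: the core of the audit trail a PLM
# platform provides. All product names and fields here are illustrative.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class Decision:
    product: str
    decision: str
    rationale: str        # the "why" that spreadsheets and email tend to lose
    author: str
    timestamp: str

class DecisionLog:
    """Append-only: entries are recorded, never edited or deleted."""
    def __init__(self) -> None:
        self._entries: list[Decision] = []

    def record(self, product: str, decision: str, rationale: str, author: str) -> None:
        self._entries.append(Decision(
            product, decision, rationale, author,
            datetime.now(timezone.utc).isoformat()))

    def history(self, product: str) -> list[Decision]:
        """Replay every recorded decision for a product, oldest first."""
        return [e for e in self._entries if e.product == product]

log = DecisionLog()
log.record("Emerging Markets Fund", "cap management fee at 0.75%",
           "competitor benchmarking and regulatory guidance", "product analyst")
print(log.history("Emerging Markets Fund")[0].rationale)
```

When an investment theme comes back into fashion years later, `history()` replays the earlier decisions and their reasoning instead of forcing the team to repeat the cycle.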

An efficient PLM process can greatly boost a company’s corporate memory, one of the key reasons for its increasing prominence within the asset management industry.

“The [PLM] discipline is growing out of its traditional role as an engineering system and is increasingly recognized as a key piece of the company’s infrastructure and – where the crown jewels are held – their intellectual property,” PwC’s US PLM leader, Tim Burks, said recently in online business journal Raconteur.

Not only is an idea never forgotten with PLM; if product development is handled efficiently and coherently, then all ideas generated should line up with the company’s overall strategy.

“Really good product development only occurs in investment management when everyone understands what the benefit of the vehicle is to the end client,” Lucken said. “If people aren’t fully aware or if they’re only a minor part of the process and don’t have the data available to them, then they don’t see the bigger picture of what the product is trying to achieve.

“A lot of people go to work every day and they just turn the handle,” Lucken said. “But good PLM should allow everyone in the business to understand what is going on and why.”

For information on improving product development with PLM, please visit:

Crewless ships

Marine companies explore the possibilities of autonomous vessels

Rebecca Lambert

4 min read

One of the world’s first fully automated electric container ships, the Yara Birkeland, is set to take to the water in late 2018 and is expected to be entirely self-guided by 2020. Similar vessels are in development, signaling the rise of a new generation of ships that rely on intelligent software to safely navigate the seas.

On the west coast of Finland lies Jaakonmeri Test Area, a patch of open water reserved for testing autonomous vessels. Managed and controlled by the Finland-based public-private innovation hub DIMECC (Digital, Internet, Materials and Engineering Co-Creation), it is one of the first sites in the world available to anyone wishing to test ships that can navigate safely – without a crew.

DIMECC’s goal is to help set the standards for autonomous ships by 2025. Through its One Sea initiative, DIMECC is working alongside its global marine expert members to initiate research programs, educate industry leaders and help to establish suitable rules and regulations around safety, sustainability and future technology developments.

The reason behind the group’s commitment? Its belief that autonomous technology will bring substantial benefits to ship owners.

“The main benefits are improved safety, sustainability, cost efficiency and the creation of new revenue models,” One Sea lead Päivi Haikkola said. “Increased autonomy can improve safety at sea, as a large proportion of accidents are caused by human error [insurance firm Allianz estimates human error accounts for 75%-96% of accidents]. Cost efficiency comes from both decreased capital and operational expenditure; for example, decreasing fuel consumption due to better voyage planning and execution.”

A 2016 report on the technology’s potential by the Technical University of Denmark, “A pre-analysis on autonomous ships,” echoes Haikkola’s arguments. It suggests that companies will lower costs substantially by reducing crew and support workers, most notably on smaller ships engaged in near-coastal voyages. These include small island ferries, tugboats, barges and supply and service vessels.


“Due to regulatory restrictions and technological limitations, the first vessels to go autonomous will be those operating in coastal waters and inland waterways,” said P. Michael A. Rodey, senior innovation manager, Technical Innovation, at Danish transport company A.P. Moller – Maersk Group. “These segments are where we have the lowest risk, best connectivity, soundest business case and least technological challenges.”

Maersk works with partners that include Rolls-Royce to explore the potential of remote-control and autonomous shipping technology. In November 2017, it launched Svitzer Hermod, which it describes as the world’s first remotely operated commercial vessel. In April 2018, Maersk confirmed that it is testing perception and situational awareness technology from US-based Sea Machines Robotics aboard one of its new Winter Palace ice-class container ships.

Completely autonomous or unmanned vessels are not Maersk’s goal, however.

“If we use the Svitzer Hermod as an example, this was driven from a technology perspective when it was sanctioned in 2015,” Rodey said. “Back then, autonomous vessels were just starting to be discussed as a coming reality, and Svitzer [Maersk’s towage company] had already been developing some ideas around this for the past few years. Svitzer Hermod was initiated as a proof-of-concept to drive the technological development with the aim of separating fact from fiction, and proving the technology worked, was safe and value adding.”


Norwegian maritime technology provider Kongsberg, meanwhile, is delivering the technology for Yara Birkeland, an open-top container ship launching in 2018. Kongsberg has announced that the vessel will be the world’s first fully electric and autonomous container ship, with zero emissions and no need for ballast. Kongsberg is providing the sensors and integration required for remote and autonomous operations, along with the electric drive, battery and propulsion control systems.

Yara Birkeland is expected to sail, without a crew, within 12 nautical miles of the coast between three ports in southern Norway by 2020. When operational, the ship will be remotely monitored by three centers that will provide emergency and exception handling, condition monitoring, operational monitoring, decision support, surveillance of the autonomous ship and its surroundings, and other aspects of safety.

“Simply put, a highly automated navigation system for a vessel requires three main technologies: sensors (cameras, lidar, radar, AIS, GPS, compass); connectivity (4G/5G, satellite and mesh); and software (algorithms for collision avoidance and navigation and vessel health management),” Rodey said.
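The software layer in Rodey’s list can be sketched around the closest-point-of-approach (CPA) calculation, a standard building block of collision avoidance. The class, function names and thresholds below are simplified illustrations of the concept, not Maersk’s or Kongsberg’s actual systems.

```python
# Minimal sketch of collision-avoidance logic: given a nearby vessel's
# relative position and velocity (as fused from radar/AIS/GPS sensors),
# compute its closest point of approach and flag it if it will pass too
# close within a planning horizon. Thresholds are illustrative.
from dataclasses import dataclass
import math

@dataclass
class Track:
    """A nearby vessel in local coordinates, meters and meters/second."""
    x: float; y: float      # relative position (m)
    vx: float; vy: float    # relative velocity (m/s)

def cpa(track: Track) -> tuple[float, float]:
    """Return (time to closest approach in s, separation at that time in m)."""
    speed_sq = track.vx ** 2 + track.vy ** 2
    if speed_sq == 0.0:                      # target stationary relative to us
        return 0.0, math.hypot(track.x, track.y)
    t = max(0.0, -(track.x * track.vx + track.y * track.vy) / speed_sq)
    dx, dy = track.x + track.vx * t, track.y + track.vy * t
    return t, math.hypot(dx, dy)

def collision_risk(track: Track, min_dist: float = 500.0,
                   horizon: float = 600.0) -> bool:
    """Flag a track that will pass within min_dist meters inside the horizon."""
    t, d = cpa(track)
    return t <= horizon and d < min_dist

# A target 2 km dead ahead, closing at 4 m/s, reaches us in ~500 s: flag it.
print(collision_risk(Track(x=2000, y=0, vx=-4, vy=0)))
```

A production system would layer COLREGs-compliant maneuver selection, vessel health management and degraded-sensor handling on top of this kind of geometric check, which is where the human oversight Rodey describes comes back in.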

Although it is possible for ships to sail autonomously, he said, the company has no plans for ships to operate completely without human intervention. “Most people tend to view autonomy as ‘free from human interference,’” Rodey said. “So in the maritime industry, do we mean navigation only, or the vessel as a whole? Since vessels are composed of complex machinery that operates in a very harsh environment, there is no such thing as a vessel which can be free from human input for maintenance. Likewise, for autonomous navigation, unless true sentient artificial intelligence is developed, which is not likely in our lifetime, then human input will still be needed for operations.”


In an interview with World Maritime News in April 2017, a Rolls-Royce executive said that the technology will be in common use in less than a decade.

“By 2025, we hope to have a remotely operated vessel at open sea, and five years after that we expect unmanned oceangoing vessels to be a common sight,” Oskar Levander, vice president of innovation, marine at Rolls-Royce, was quoted as saying.

One Sea’s Haikkola agrees.

“The possibilities are there – all the participants in our ecosystem are the best in their fields,” she said. “And the technologies are already there in the specifications of many ships that are on order at the moment.”

Yet work remains before the world will be truly ready for autonomous ships.

“Public acceptance and awareness of what autonomous technologies can and cannot do must increase,” Haikkola said. “We will also need rules to accommodate autonomous technologies.”

Maersk’s Rodey echoed those sentiments.

“Regulations, both on a national and international level, have not kept pace,” he said. “Likewise, the technology, while it is getting there, is still not yet proven and comes at too high a cost. That said, things are moving very quickly, and we will see the first autonomous vessels rolling out over the next year or two.”

For information on Marine & Offshore solutions, please visit:

Consumer-centric innovation

Consumer goods and retail companies evolve to suit consumers’ lifestyles

Jacqui Griffiths

3 min read

Wellness, comfort, convenience and safety are perennial concerns that touch every aspect of consumers’ lifestyles. Increasingly, leading brands and retailers are putting these interests at the center of their innovation strategies.


Britax Römer, a global manufacturer of child car seats, pushchairs and bike seats, continuously evolves its products to meet the highest safety and quality standards while addressing the lifestyle needs of modern families.

While the strategy is paying dividends – Britax Römer has grown into one of the world’s most trusted child safety product brands – offering so many options while ensuring material safety complicates management of the company’s supply chain.

“Parents are demanding high-quality products now more than ever,” said Moritz Walther, marketing director for EMEA at Britax Römer. “They are careful to ensure that the products that come into contact with their children have low levels of harmful chemicals and are safe to use. Additionally, they expect as much performance, comfort and ease of use as possible, as well as products that reflect their personal style. As a child safety manufacturer, we look to offer safety not only in terms of the features we provide, but also how and where the products are made.”

Britax Römer is a prime example of a trend driving many makers of consumer home goods today: the need to accelerate innovation and deliver on consumer demands.

“There is an emphasis among consumer product companies on innovation excellence linked to customer experience at scale – creating and monitoring customer experiences for individuals and groups,” said Ivano Ortis, vice president of US-based IDC Manufacturing Insights and IDC Retail Insights. “At the same time, the number of new products has increased. This creates operational issues, not least in terms of the supply chain’s ability to support the pace of innovation.”


Health, wellness and convenience are key areas of focus in these endeavors.

“Marketing and innovation efforts have improved in the key areas today’s consumers are focused on – health and wellness, entertaining and convenience,” Joe Derochowski, executive director and industry analyst at US market research organization NPD Group, wrote in a 2018 blog. “Continued developments in innovation will be critical to staying top of mind with consumers, but those innovations need to have the consumer in mind.”

Britax Römer provides a range of pushchair options for consumers, innovating to balance performance, comfort and ease of use with personal style. (Image © Britax Römer)

For Britax Römer, the need to combine safety and quality with meeting customers’ lifestyle needs leads to an ever-expanding product selection, which requires complete control over data and processes.

“Two of our most popular pushchairs, for example, are extremely different,” Walther said. “The GO range is based on a Scandinavian design with extensive features and provides the ultimate comfort for parents and children. On the other hand, the Britax HOLIDAY is a simple, lightweight buggy ideal for a quick errand or for traveling.”


Britax Römer’s “made with care” philosophy dictates strong internal standards for chemical and mechanical testing of fabrics and other components – often exceeding regulatory requirements. Safety concerns also drive the company’s supply chain decisions, and all of its suppliers and materials must be fully vetted.

“The safety of the products is directly linked to the component part quality,” Walther said. “The parts must be manufactured and delivered to the requested, defined quality without deviating from the exact specification.”

Britax Römer builds close relationships with suppliers that meet its high standards. “We primarily work with suppliers with whom we have long-term relationships; they recognize the importance of component part quality required for child safety products,” Walther said. “The majority of Britax Römer car seats and all our car seat fabrics and covers are made in Europe. This is an important step in our commitment to tightening quality control and developing the safest possible products for children.”

Innovating to meet rigorous standards while responding quickly to market trends is a combined effort that requires close collaboration among all departments, from engineering to marketing. Britax Römer uses a sophisticated, end-to-end business collaboration platform to manage the coordination.

“For car seats, for example, we develop the initial concept for each project by combining our know-how from previous products with new ideas and input,” Walther said. “The computer-aided design (CAD) engineers make the first CAD models in cooperation with the design, testing and simulation teams. Popular features from previous products are stored in the system as a reference catalog, and this helps us consistently meet tried-and-true safety standards.”


In the fast-moving consumer goods and retail marketplace, continuous, contextual and customer-centric innovation is a must. Finding smart, collaborative ways to achieve that, as Britax Römer has done, is a prerequisite for future success.

“Speed, insight and agility characterize the business needs of consumer goods companies,” IDC’s Ortis said. “The focus has shifted from selling products to delivering experiences. Consumer product businesses need to engage customers at scale and create a dynamic supply chain that enables rapid innovation to meet changing expectations. This requires a new model of automation and collaboration enabled by technologies such as cloud-based solutions, robotics, artificial intelligence and augmented reality.”

These capabilities, Ortis said, create space where companies and departments can work together and innovate efficiently. “This is where companies will transform their business. It is a fundamental enabler for future opportunities."

For information on managing a supply chain for improved consumer experience, please visit:

Industrial symbiosis

One company’s waste is another’s raw material

Dan Headrick

3 min read

Industry and environment can co-exist, as eco-industrial parks have been demonstrating for more than 50 years. Recently, emerging economies, heightened environmental awareness and advanced digital technologies have created a surge in the development of industrial operations that seek balance with the world around them.

Half a century ago, in the tiny, 900-year-old Danish town of Kalundborg, leaders worked out an agreement to build a 13-kilometer pipeline from Lake Tissø to carry untreated cooling water to a local oil refinery. That project led to another: take the refinery’s excess gas, which would have been flared into the atmosphere, and deliver it to a nearby plant for drying gypsum board.

Similar projects followed and, without realizing it, engineers, plant managers and community leaders created a new concept called industrial symbiosis. This concept coordinates industry, agriculture, communities and markets to recycle and share resources, including waste, to improve efficiency, productivity and ecological sustainability.

The concept spread, and new kinds of collaborative efforts emerged. Today, these are generally referred to as eco-industrial parks (EIPs), where different companies running various operations cooperate with each other on shared property. This cooperation extends to government and local communities.

The result: companies operate more efficiently, resources are better managed and communities thrive economically while preserving clean air and safe water. One company’s waste is another’s raw material; resources are balanced in a circular economy: heat, water, waste, steam, gas, electricity, roads, rail lines, barges, pipelines, buildings, facilities management, emergency response, logistics, security, regulatory compliance, capital planning, and community engagement are coordinated.


The number of EIPs is spiking. In 2000, fewer than 50 existed; today, more than 250 have been identified, according to a 2018 report from the World Bank Group, the United Nations Industrial Development Organization (UNIDO) and Deutsche Gesellschaft für Internationale Zusammenarbeit (GIZ) called “An International Framework for Eco-Industrial Parks.” Each is unique.

The Miramar EIP in El Salvador has about 500 employees from 11 different companies working in electronics assembly and plastics manufacturing. The Tianjin Economic Development Area site in northeastern China includes about 500,000 employees across 10,000 different companies on a single site that includes its own seaside port, schools, football stadium and residential areas.

Some EIPs are decades old and being retrofitted with green technologies for modern processes. Others are more recently minted. Some have developed at the community level, others through heavy government support. What they all have in common is a commitment to coordinate global operations for shared economic and environmental benefit.


In response to the rapid growth of EIPs, the World Bank Group, UNIDO and GIZ in December 2017 launched the first joint framework that defines minimum parameters for EIP performance. The framework was written because more governments and industries want to create their own EIPs.

“Governments are coming to us asking what an eco-industrial park is," said the World Bank Group’s Etienne Kechichian, a principal author of the report who leads the World Bank Group’s Climate Efficiency Industries effort. “Korea, Japan, China. Developing countries play a key role, and there is a lot of collaboration."

Dolf van der Kamp is manager of inspections within SITECH, a company that coordinates services for the sprawling Chemelot EIP in Geleen, Netherlands. Chemelot represents a collaboration of about 30 separate industrial companies that produce a variety of petro-chemical, specialty chemical and plastics products, coordinating a wide and complex range of operations to reduce raw materials usage, repurpose waste and maintain optimal operating efficiency.

Over time, different companies come and go, which adds to operational complexity. “Companies change, but product portfolios must stay compatible," van der Kamp said. “SITECH is not only a service organization; customers are also shareholders. We need very good alignment to work as efficiently as possible."


The explosion of digital tools and ever-faster data speeds makes increasingly complex collaborative processes possible, contributing to the surge in EIPs, said David Aitken, associate director of policy and innovation at The Carbon Trust, headquartered in London.



“The move to Industry 4.0, or the fourth industrial revolution, is an important development and part of the reason for this growth," Aitken said. “New monitoring, communications and analytics technologies are emerging. These have the potential to improve industrial productivity and resource efficiency, which are key drivers for EIPs. Environmental protection, climate change and resource efficiency are making the imperatives for EIPs stronger."

For example, developments in modeling and simulation software make it possible for plant operators to create complex “virtual plants" that not only measure and monitor what’s happening in the actual plant but calculate process controls and what-if scenarios for planning, management and decision-making by plant operators.

Continually improving digital modeling and management tools also empower plant managers to meet increasingly strict and frequently changing regulatory requirements.

“It makes it easier to meet the requirements," said SITECH’s van der Kamp. “With so many companies working together, you can coordinate with government and international structures. If you have all stakeholders separate, this will be impossible."

For information on EIP solutions, please visit:

Oil & Gas field ‘digital twins’

Intelligent digital models empower real-time decision making

3 min read

The upstream oil and gas industry is pushing to apply digital technologies to exploration and production practices, yielding better business returns by optimizing processes and increasing efficiency. From replicating a complicated well design to refining a well control system to sharply decreasing the risk of a blowout, the ‘digital twin’ is one more concept advancing the industry’s digital transformation.

Since production of the Boeing 737 aircraft began in 1967, the time needed to complete construction of a single jet has gone from several months to just eight days. How did Boeing do it? In many cases, the company applied sophisticated computer software to the challenge of improving its manufacturing processes, organization and efficiency.

Today, the oil and gas industry is applying similar digital solutions to reduce operational risk and ensure premium performance throughout the life of its assets. Specifically, these solutions monitor, model and view real-time oilfield operations above and below ground as “digital twins,” sophisticated 3D simulations that help operators visualize and understand what the huge volumes of data generated by ongoing oil and gas projects really mean – and how to act on it.

“There are multiple advantages to having an accurate digital model of your business,” said Nabil Kassem, chief executive officer of ExcellenceO2, a management consulting company based in Dubai. “The main advantage is that a good model enables you to make critical decisions about your oil or gas field quickly and without the confusion that can arise from having several experts weigh in on the same question from different points of view. The result is faster, better-informed decisions about your asset.”


According to a recent report by global management consultancy McKinsey, the typical offshore oil and gas platform “operates at approximately 77% of its maximum production potential” – a figure that implies an industry-wide performance gap of some US$200 billion (€170 billion). The reason? Operators are not taking advantage of currently available “analytics tools and techniques capable of unlocking the production potential of complex process facilities and enhancing asset investment returns.”

Kassem points to field development planning (FDP), the stage at which a company plans an oilfield project, designs the project’s wells and considers how it will produce the oil, as a likely point for digital modeling to help close this gap.



“Most decisions at this stage are about getting the size of the operation right – not too big, not too small,” Kassem said. “If flawed analysis leads to oversizing a project, your capital expenditure can suffer. Likewise, undersizing a project can engender painful outlays, since expanding a project is much more expensive than getting it right the first time. This is why having a digital model is so important. All available information is made accessible so the right decisions can be taken and so potential missteps can be avoided.”

Digital modeling – including the use of digital twin simulations – could provide a major, difficult-to-achieve economic benefit, experts say: the ability to increase an oilfield’s longevity by ensuring optimized operations.

“Today’s oilfield is in information overload,” said Indy Chakrabarti, senior vice president, E&P Software, Emerson Automation Solutions, a developer of software-enabled solutions for the oil and gas industry. “This means operators can’t optimize their decisions across an entire field over time because there is too much data to be considered. Instead they must focus on one well at a time, which can create problems. Ideally, you would be able to control all your wells in a unified fashion for the entire life of the field.”


Managing an oilfield without digital modeling is like trying to follow a route with road signs instead of a GPS, Chakrabarti said. “If you’re going only three miles, you’ll be fine with just signs,” he said. “But if you’re going three thousand, you’ll save a lot of time if you know about any traffic problems or road detours well in advance.”

A coordinated digital platform offers benefits well beyond field optimization, Chakrabarti said.

“Health, safety and environmental performance all stand to improve as a result of increased access to digital information,” he said. “With this type of system in place, any KPI [key performance indicator] a company wants to measure can be recorded and updated on demand. That’s added value.”

For information on expanding the manageability of oil and gas fields, please visit:

Building skills for the future

Micro-credentials enable rapid reskilling of workers

Jacqui Griffiths

4 min read

Technological, demographic and socio-economic changes are transforming industry and creating a continual demand for new skills from tomorrow’s workforce. Educators, businesses and governments are creating programs and credentials designed to answer those needs.

For Jennifer Vandiver, a secondary science teacher at Collinwood High School in Wayne County, Tennessee, personalized learning is important for the development of educators and the students they teach.

“In many districts, teachers are provided the same professional development regardless of their subject area, grade level or personal goals,” Vandiver wrote in a blog for Digital Promise, a US nonprofit organization whose mission is to improve learning opportunities through technology and research. “The irony of this is that many of us educators have been successfully navigating the waters of personalized learning for our students, but haven’t been given the opportunity to transition that same approach to our own professional learning.”

Micro-credentials – short, focused, certified courses to develop individual skills – provided a path for Vandiver to achieve personalized learning. Having searched online for professional development opportunities, she joined the Tennessee Department of Education’s Micro-credential Pilot Program.

“I successfully completed three micro-credentials during the first phase: Wait Time, Idea Generating, and Design Thinking and Doing,” Vandiver wrote. “I then agreed to become a virtual community facilitator for the second phase, which allows me to support Tennessee educators who are beginning their micro-credential journey.”



In an era when work and the educational requirements for doing it are changing so rapidly, experts in both education and business agree that micro-credentialing is a promising way to help workers remain employable for the long run by continuously updating their skills.


Micro-credentials provide an answer to the need for rapid reskilling, certified by digital badges that embed evidence of the skills each individual has mastered.

“Micro-credentials are ideal for enabling learners to master discrete sets of desired skills,” said Mark Leuba, vice president of product management at US-based nonprofit learning consortium IMS Global, which oversees the Open Badges micro-credential standard OBv2.

“For learners and educators, OBv2 provides definition and communication of verifiable achievements and skills,” Leuba said. “For employers, it provides ways to match candidates’ skills to available opportunities. And for administrators, it provides better recording, planning and management of credentials for individuals. Crucially, the standard ensures interoperability, so learners can transfer their badges to different forums as they search for jobs, change employers or apply to educational institutions.”
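As a rough illustration of how a badge can embed verifiable identity and evidence, the Python sketch below builds a minimal OBv2-style Assertion record. The issuer URLs, salt and timestamp are hypothetical, and a production issuer would follow the full Open Badges specification rather than this simplified sketch.

```python
import hashlib
import json

def hashed_identity(email: str, salt: str) -> str:
    """Hash a recipient email (sha256 over email + salt), so the badge
    can be verified without publishing the learner's address."""
    digest = hashlib.sha256((email + salt).encode("utf-8")).hexdigest()
    return "sha256$" + digest

def make_assertion(email: str, salt: str, badge_url: str,
                   evidence_url: str, issued_on: str) -> dict:
    """Build a minimal OBv2-style Assertion; all URLs are illustrative."""
    return {
        "@context": "https://w3id.org/openbadges/v2",
        "type": "Assertion",
        "id": badge_url + "/assertions/1",
        "recipient": {
            "type": "email",
            "hashed": True,
            "salt": salt,
            "identity": hashed_identity(email, salt),
        },
        "badge": badge_url,          # IRI of the BadgeClass describing the skill
        "verification": {"type": "hosted"},
        "issuedOn": issued_on,
        "evidence": evidence_url,    # link to the work that earned the badge
    }

assertion = make_assertion(
    "learner@example.org", "s3cr3t",
    "https://issuer.example.org/badges/wait-time",
    "https://issuer.example.org/evidence/42",
    "2018-07-30T00:00:00Z",
)
print(json.dumps(assertion, indent=2))
```

Because the recipient identity is salted and hashed, anyone holding the learner’s email can confirm the badge belongs to them, while the published record reveals nothing on its own.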

Leading global companies have been quick to recognize the standard’s value. “Microsoft Worldwide Learning is using OBv2 for its unified framework for credentialing and certification,” Leuba said. “IBM has issued micro-credentials to hundreds of thousands of learners worldwide who have demonstrated skills, often in IBM-related products and services. Now it is locating qualified staff and contractors for projects over the internet based on micro-credentials incorporated in Open Badges. This new service, called IBM Talent Match, is an excellent example of what’s possible.”


“Micro-credentials provide students with ways to learn, and to keep learning while they go out and apply what they’ve learned,” said Cali Morrison, assistant dean of the School of Continuing and Professional Studies at American Public University System (APUS). “Technology gives us the opportunity to tie evidence to the micro-credential, so employers can see what the students did to earn it. As a result, learners feel the micro-credential makes them more marketable, and we are seeing growing acceptance of these sub-degree credentials among employers.”

APUS, founded in 2017, is designing credit-bearing micro-credential stacks that allow learners to earn career-relevant, cross-disciplinary certifications alongside their degree studies. In addition, it offers non-credit-bearing micro-credentials in skills such as coaching and mentoring.

“We want to extend the institution’s offerings to reach more lifelong learners so they continue to build their skills and knowledge through everything from webcasts to micro-credential certificate programs,” Morrison said.


Micro-credentials play a key role in enabling workers to build stacks of skills that meet the changing needs of industry. Businesses and educators are embracing this flexible and focused learning. As momentum builds, more widespread recognition is sure to follow.

“We need recognition that there is value in all types of post-secondary education, whether it’s vocational, academic or practical,” Morrison said. “Stackable credentials are what’s next. They will reduce the traditional dependence on credit hours, enabling learning in smaller chunks that can add up to something much bigger.”

That is certainly the case for Vandiver, who finds micro-credentials a continually empowering experience.

“I’ve always believed the best professional learning for any teacher takes place in the classroom – a belief foundational to micro-credentials,” she wrote. “By doing so, my students become a critical part of my micro-credential journey. And what I’ve found is that they love having a role in my learning. They enjoyed seeing me ‘in their shoes’ as a learner, watching me adjust my practice ‘on my feet’ and make my learning visible.”

87% of US workers believe training and skills development throughout their work life will be essential or important to career success.


“I’m excited for the continued growth I’ll get to share with my students through micro-credentials,” Vandiver said. “As is the case with any form of professional learning, micro-credentials may not be the path for everyone, but they have been an invaluable partner on mine.”


Millions of unfilled jobs worldwide attest to the fact that people are leaving school without the skills businesses need, including problem-solving, adaptability, collaboration, leadership, creativity and innovation – or without any way to prove that they have acquired them in the workplace.

As advances in technology transform the way people do business, new skill requirements – and new ways to meet them, such as micro-credential and micro-learning programs – will continue to emerge.

For information on training and certification through Dassault Systèmes:

The psychology of personalization

Why do people want unique products and experiences?

Rebecca Gibson

4 min read

Modern technologies, including artificial intelligence, machine learning and data analytics, make it easier to create the personalized experiences, products, services and communications that consumers desire. To create a perfect individualized experience, however, companies must first understand why people want personalization.

“Remember that a person’s name is, to that person, the sweetest and most important sound in any language,” American self-improvement expert Dale Carnegie wrote in his book How to Win Friends and Influence People. By using a person’s name whenever appropriate, Carnegie suggested, you could better influence that person’s way of thinking.

As companies use technology to personalize marketing content, services, products and experiences to individual customers’ preferences, Carnegie’s advice has become increasingly relevant. And it’s working – Econsultancy’s 2017 “Conversion Rate Optimization” report found that 93% of companies report having more success in converting prospects into customers when they personalize their marketing. But what makes personalization so effective?

“The reason is simple,” said Jeriad Zoghby, global lead for personalization at Accenture Interactive in Austin, Texas. “Customers like to feel unique. They appreciate a company that recognizes and remembers them and wants to make their experience enjoyable.”

“Personalized products and experiences make us feel unique in a sea of sameness,” said Laura Bright, associate professor of communication at Texas Christian University in Fort Worth. “Brands now have so much information about their customers that they can instantly offer exactly what they want when they want it, which consumers like because it’s more convenient and time efficient. This customization and instant access to what they need also allows consumers to feel as though they’re in control.”


Today, however, personalization does more than stroke our egos. It also respects our time.



The modern marketplace is awash in similar products with minor variations; reviewing all of them can be overwhelming. The more a company knows about its customers, the more precise it can be in recommending exactly what each user is likely to want.

In fact, 45% of potential customers have abandoned a website because the amount of information or number of choices was overwhelming, according to the “2018 Accenture Interactive Personalization Pulse” report. A company that knows its customers well, however, can deliver those choices that best match an individual.

“The human brain hasn’t developed to cope with the flood of information and the thousands of mini decisions we’re faced with every day,” said Liraz Margalit, an Israel-based web psychologist and head of Behavioral Research at global experience analytics company Clicktale. While the internet is responsible for generating much of this overload, “personalization allows companies to present customers with a shortlist of search results, product recommendations or service options specifically tailored to their personal needs and preferences, so they can easily work out what’s best for them.”

Retail giant Amazon, for example, sells more than 562 million products. Its machine learning algorithms proactively display product suggestions personalized to each customer, based on everything Amazon knows about their preferences.
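The co-purchase logic behind such suggestions can be sketched in a few lines. The toy example below – with made-up baskets, and no claim to represent Amazon’s actual algorithms – recommends the products most often bought alongside a given item:

```python
from collections import defaultdict
from itertools import combinations

# Toy purchase histories: user -> set of products bought (all hypothetical).
purchases = {
    "ann":  {"kettle", "teapot", "mugs"},
    "ben":  {"kettle", "teapot"},
    "cara": {"kettle", "mugs", "toaster"},
    "dev":  {"toaster", "mugs"},
}

# Count how often each ordered pair of products appears in the same basket.
co_counts = defaultdict(int)
for basket in purchases.values():
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(product: str, top_n: int = 2) -> list:
    """Return the products most often co-purchased with `product`."""
    scores = {b: n for (a, b), n in co_counts.items() if a == product}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("kettle"))
```

Real systems replace the raw counts with similarity measures and learned models, but the principle is the same: the more purchase history available, the sharper the shortlist.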


By empowering customers to pick their favorite option from a shortlist of products or services tailored to reflect their interests, brands also are answering another basic human need: the need to feel in control.

“Consumers don’t want brands to define or dictate their experience,” Accenture Interactive’s Zoghby said. “They want to feel as though they have a level of control and autonomy. Rather than predicting what customers want and forcing experiences on them, successful companies anticipate what services or products customers may like and then allow the customer to choose whether or when they want to use those services. These companies are also transparent about the data they want to collect and how it will be used to develop value-added services for customers, and they allow consumers to opt in or out of sharing information. Together, these factors ensure that customers perceive that they’re fully controlling their interactions with brands.”


Despite recognizing that customers want experiences that make them feel important and in control, companies often struggle to develop effective personalization strategies. The fundamental challenge? Companies often fail to understand the motivations behind a customer’s actions.

“Companies mistakenly think they can predict customer behavior by analyzing data about what individuals have done in the past, assuming they will do the same in the future,” Clicktale’s Margalit said. “However, humans are intelligent, complex and emotionally driven, so their decisions are always influenced by multiple factors.

“Consequently, companies should apply psychological theory alongside analytics, so they can identify how external factors combine with their customers’ different personality traits, mindsets, expectations and preferences to impact on their decision to buy (or not buy) products. Once organizations truly understand how their customers are likely to behave in any given mindset and situation, they can develop personalized experiences that will delight customers and drive sales.”

Margalit and her team at Clicktale have identified five customer mindsets: focused, mindful, explorative, disoriented and disinterested. In a physical store, a sales assistant can use a customer’s body language to determine what mindset they are in and how receptive they will be to certain personalized services. To do the same online, companies must analyze patterns in customers’ browsing habits.

“Focused customers who have already decided to buy a specific product will likely click on one product and head straight to checkout,” Margalit said. “Mindful shoppers may hover over different items on a web page for several minutes. Disoriented or explorative consumers might flit between multiple pages and several product categories, and disinterested people will leave the site quickly or abandon their basket.”

Once companies learn to recognize certain behavioral patterns, Margalit said, they can determine which type of personalized experiences individuals will find helpful in real time. “For example, they could send a personalized recommendation to an explorative or disoriented customer during the shopping process to help them make a purchase decision.”
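One hypothetical way to turn those browsing patterns into a real-time label is a simple rule-based classifier. The sketch below maps crude session signals to the five mindsets described above; the thresholds are purely illustrative, not Clicktale’s.

```python
def classify_mindset(pages_viewed: int, categories: int,
                     hover_seconds: float, reached_checkout: bool,
                     session_seconds: float) -> str:
    """Map crude browsing signals to one of five mindsets.
    All thresholds are illustrative assumptions."""
    # Leaves the site quickly without buying: disinterested.
    if session_seconds < 30 and not reached_checkout:
        return "disinterested"
    # One or two pages, straight to checkout: focused.
    if reached_checkout and pages_viewed <= 2:
        return "focused"
    # Lingers over a few items for minutes: mindful.
    if hover_seconds > 120 and categories <= 2:
        return "mindful"
    # Flits across many categories: disoriented.
    if categories >= 4:
        return "disoriented"
    return "explorative"

print(classify_mindset(pages_viewed=1, categories=1,
                       hover_seconds=5, reached_checkout=True,
                       session_seconds=90))
```

A production system would learn these boundaries from labeled sessions rather than hand-set them, but the output is the same kind of signal: a mindset label that decides which personalized nudge, if any, to offer in the moment.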


Personalization offers significant benefits for consumers, but it also has multiple advantages for organizations that master it.

“The better companies are at understanding why customers like personalization, the better they will be at offering customized experiences for their clientele, and the more customers will trust and embrace their services and products,” Texas Christian’s Bright said.

Humans are naturally happiest when they are recognized and valued as individuals, and when they feel safe and in control of their experiences, Margalit said.

“When companies get it right, personalization answers all of these basic human needs because it enables every customer to feel like they have their own 24/7 personal assistant who can be trusted to help them make the best possible decisions,” she said. “Once customers have experienced this type of personalized assistance from a brand, they won’t settle for anything less.”

Formula for success

Fast, focused learning speeds formulated products from design to production

Jacqui Griffiths

3 min read

Incompatible IT infrastructures, language barriers, different methodologies – these are just some of the many factors that make it challenging to transfer product and process knowledge from development to manufacturing. Compass spoke to Graeme Cruickshank, director of the UK Centre for Process Innovation’s National Formulation Centre, about the secrets behind getting great formulations to market.

COMPASS: Why do organizations find technology transfer difficult, and how does the National Formulation Centre help to overcome these challenges?

GRAEME CRUICKSHANK: Joining up the knowledge between different organizations or divisions within a large corporate company can involve teams in different geographies, with different cultures and speaking different languages. We need to find a common ground to understand where each side is coming from initially, or misunderstandings and confusion will stop any meaningful progress.

Even within the same organization, software systems often don’t talk to each other. Cost reduction pressures mean that companies don’t want to replace the entire system. Even if they did, they would still face the challenge of communicating with the systems of external organizations.

In principle, these entities often need to join up the systems they’ve already got. This is where open-access innovation centers for advanced formulated product design and manufacture can help. At the National Formulation Centre, for example, we bring together industrial knowledge, research and technology infrastructure so that companies can accelerate the commercialization of formulated products.

How do trends like personalization add to the challenges of tech transfer?

GC: Personalization exists in almost all categories, from stratified medicines to the different drinks available in coffee shops or the various powders, liquids, tablets and combinations in the supermarket laundry aisle. As a result, production is becoming more fragmented and scale-up of formulations more localized as companies seek to make smaller quantities to meet increasingly dynamic consumer demand.

Seamless information transfer between R&D systems and multi-center factory systems is more important than ever in this environment, so that knowledge learned in one geography can be applied in another.

Graeme Cruickshank, Director, UK Centre for Process Innovation’s National Formulation Centre

Can knowledge from other sectors help smooth the path to production?

GC: One of the fastest ways to help one sector of the industry is to give it a solution from a different sector. In formulations, most challenges exist across a range of products, but organizations haven’t necessarily realized that. They’ve kept trying to invent their own solutions. For example, issues with a deodorant can be like those in paint formulation – both are about spraying stuff on quickly and evenly, having it dry quickly without feeling tacky, and self-healing to cover the bits the user has missed.

In a highly competitive industry, this horizontal technology transfer is popular with organizations because it helps them find solutions while avoiding the intellectual property issues and defense mechanisms of knowledge transfer between competing companies or divisions.

What role does predictive modeling play?

GC: We are developing more predictive tools, we’ve got better data capture systems and we can better harvest our knowledge, so we’re more in control of our product. But there’s still a long road ahead because when you’re dealing with liquids and pastes that flow and mix, and which might be sensitive to time, temperature and process, the results are not easy to control or predict.

Is there a secret to creating a smooth transition from design to production?

GC: The key is to learn small, learn fast and learn thoroughly. Ultimately, successful technology transfer is about sharing the right information with the right people. A lot of the skill is working out which attributes you need to measure and getting good data quality and information-sharing on those. Big organizations are now looking for neutral testing grounds where they can quickly screen through all their technology leads and make objective decisions about which one they should focus on scaling up. There is a limit to how many big, expensive trials you can do and how much you can learn from them. But using a high-throughput robotic system on small-volume samples can generate thousands of data points. When you have that data density and data quality, you can use it to develop predictive models that will help with your next-generation products.

Blockchain and supply chain

Distributed ledger technology simplifies supply chain processes

Charles Wallace

4 min read

Blockchain, the technology behind cryptocurrencies, is beginning to streamline the paperwork-burdened supply chain. By digitally tracking every transaction in a form that, in theory, cannot be forged or altered, blockchain could eliminate the supply chain’s need for every financial trust system ever established, from banks to corporate accounting departments.

Li & Fung, a Hong Kong-based supply chain manager for a number of global brands, tracks the interactions among thousands of suppliers and customers, generating an enormous paperwork burden. But in 2017, Li & Fung experimented with tracking the intricate choreography of customer orders as they moved through the supply chain with a new technology that dramatically improved visibility and reduced the time and effort required: blockchain.

Blockchain is a database distributed over the computers of all of its members or supply chain participants, so the ledger grows stronger as its user-base grows. When iron ore, an ear of corn or a t-shirt is traded between companies that participate in a blockchain, the system records it.

Each transaction creates a unique numerical identifier, which is recorded in encrypted form in a data block. When the next transaction involving that product happens, such as the item being shipped, the data is entered into another block. That block is chained to the first data block. Hence, blockchain.

The distributed computer ledger technology records detailed supply chain information, securely and permanently. For example, it can verify whether a participating supplier in China has enough purple fabric in stock to fulfill an order for soccer team shirts, record when the order is dispatched to the sewing factory and automatically confirm that the shirts have shipped.
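The chaining described above can be illustrated with a short, hypothetical Python sketch (not any vendor’s actual implementation; the item names and field layout are invented). Each block stores the cryptographic hash of the block before it, so rewriting an earlier record changes its fingerprint and breaks every later link:

```python
import hashlib
import json

def make_block(transaction, prev_hash):
    """Bundle a transaction with the hash of the previous block,
    then fingerprint the result with SHA-256."""
    block = {"transaction": transaction, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

# Two events in the life of one (made-up) order, chained together.
ordered = make_block({"item": "soccer shirts", "event": "order placed"}, "0" * 64)
shipped = make_block({"item": "soccer shirts", "event": "shipped"}, ordered["hash"])

# The second block's prev_hash field is the first block's fingerprint...
assert shipped["prev_hash"] == ordered["hash"]

# ...so rewriting the first record yields a different fingerprint,
# and the mismatch exposes the tampering.
tampered = {"transaction": {"item": "soccer shirts", "event": "order cancelled"},
            "prev_hash": "0" * 64}
tampered_hash = hashlib.sha256(
    json.dumps(tampered, sort_keys=True).encode()
).hexdigest()
assert tampered_hash != shipped["prev_hash"]
```

Because every participant holds a copy of the same chain, a mismatch like the one above would be visible to all of them at once.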

Blockchain enthusiasts say that the technology will irrevocably change the way people and companies move goods and money. It has the potential to disrupt many industries where ensuring the trustworthiness of the various parties is critical, including financial services, trade finance, purchasing systems, manufacturing, utility billing, government registries – and supply chains.

International Data Corporation, a global market intelligence firm, estimates that spending on blockchain software will reach US$2.1 billion (€1.79 billion) in 2018, rising at a compound annual rate of 81% and hitting US$9 billion (€7.79 billion) in 2021.



“Unlike customer relations management software that sits on a company’s own computers, blockchain allows you to automate workflows across company boundaries with your suppliers and customers,” said Shawn Muma, who leads blockchain research at the Digital Supply Chain Institute. The institute is part of the Center for Global Enterprise in New York, and arranged the pilot study at Li & Fung. “As we move forward, you will see that the supply chain is the biggest and best application of blockchain technology.”


Blockchain technology was the brainchild of Satoshi Nakamoto, the pseudonym of the unknown tech genius who invented the cryptocurrency bitcoin. While bitcoin itself has been mired in controversy, businesses realized that the underlying technology might be an ingenious way to track global movements of goods and real-world currencies.



To read the encrypted data, users need a translator called a hashkey. In theory, the data can’t be altered because it’s replicated on each participant’s computer system; a change on one system would be immediately apparent on the others. And because the ledger is protected by advanced cryptography, it’s difficult to hack.

By eliminating the trust issues that gave rise to banks, brokerages and accounting departments in the first place, experts say blockchain has the potential to make all of them obsolete.

“In today’s world, most businesses have lots of data and have to spend a lot of time reconciling that data with other organizations,” said David Dalton, a partner for strategy and operations at business consultants Deloitte in Dublin. “Blockchain allows for multiple business organizations to share data in a way that is safe, secure and immutable.”


In fact, blockchain is so well suited to supply chain issues that A.P. Moller-Maersk, the world’s largest shipping firm, set up a blockchain joint venture with IBM to track containers as they move through ports and customs worldwide in the US$7 trillion (€6.06 trillion) annual global shipping trade. Walmart, America’s largest brick-and-mortar retailer, has established a blockchain to track food items from its thousands of suppliers, allowing it to resolve questions about the origin of food products in a matter of seconds.

Not surprisingly, financial services companies are watching blockchain’s development warily, because they see it as a possible disruptor of their role as trusted middlemen in transactions. For example, when a company receives an invoice from an overseas supplier, the purchasing department must order a bank to send payment in foreign currency. Blockchain can be set up to automatically make the payment when a certain transaction takes place, such as when items are received at customs.

Despite the threat of disruption, financial firms are the biggest investors in blockchain development, seeing it mainly as a way to streamline their back-office paperwork, said William Mougayar, a Toronto-based entrepreneur who recently published a book entitled The Business Blockchain: Promise, Practice, and Application of the Next Internet Technology.

“The killer app of blockchain is moving financial assets,” Mougayar said.


Blockchain now comes in several different implementations, including Bitcoin, Ethereum and the open-source Hyperledger, which is being adopted by a number of businesses for private use. Bitcoin is a public blockchain, but most businesses probably will want to use a private blockchain that carefully polices data access, said Rob Handfield, professor of supply chain management at North Carolina State University in Raleigh.

Handfield said that blockchain can replace today’s inefficient system of procurement and payment based on purchase orders, invoices and payments for products and services.



Making payments even easier, blockchain makes possible so-called smart contracts. For example, a shipper sends a spare part from China to a factory in the United States. Instead of submitting an invoice, a smart contract automatically pays the supplier when the blockchain records receipt of the part.
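In spirit, such a smart contract is just a rule evaluated against the shared ledger. The sketch below is a deliberately simplified Python illustration – the part IDs, event names and ledger format are invented, and real smart contracts execute on the blockchain itself rather than in application code:

```python
def settle(ledger, part_id, supplier, amount):
    """Release payment only once the ledger records receipt of the part."""
    received = any(
        entry["part_id"] == part_id and entry["event"] == "received_at_factory"
        for entry in ledger
    )
    status = "released" if received else "pending"
    return {"pay_to": supplier, "amount": amount, "status": status}

# Shared ledger so far: the part has left China but not yet arrived.
ledger = [{"part_id": "SP-100", "event": "shipped_from_china"}]
print(settle(ledger, "SP-100", "Acme Machining", 2500)["status"])  # pending

# The factory logs receipt; the contract now triggers payment on its own.
ledger.append({"part_id": "SP-100", "event": "received_at_factory"})
print(settle(ledger, "SP-100", "Acme Machining", 2500)["status"])  # released
```

The point of the pattern is that no invoice changes hands: the same ledger event that proves delivery is the trigger for payment.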

“If you have Walmart and Best Buy stores and both are buying from the same set of suppliers, they all can use a common blockchain that allows participation by Sony and Apple and other manufacturers,” Handfield said. Each participant, he said, can specify who has access to what information.

Almost every day brings a potential new use for blockchain, including activities as diverse as tamper-proof voting or collecting airline loyalty points. As with the World Wide Web a decade ago, a Darwinian process of survival is likely to produce winners, losers and applications for the technology that haven’t been imagined yet.

Until then, blockchain will track a myriad of complex transactions, allowing companies to reduce their paperwork, make payments and streamline in-house departments such as purchasing and supply chain management. Pretty impressive, especially when it can also provide proof that your crucial shipment from China left the dock on time.

For more information on blockchain, go to: http://go.3ds.com/bEnX

Augustine’s laws

Technology makes progress against former Lockheed CEO’s irreverent rules

Tony Velocci

4 min read

Nearly four decades after former Lockheed Martin chairman and CEO Norm Augustine published his iconic book chronicling the aerospace industry’s systemic challenges – struggles common to many industries, in fact – at least five of his observations are giving way to the steady advance of technology.

In the early 1980s, aerospace companies were pushing for increasingly sophisticated systems. But progress came at a price: quality became a recurring problem, and companies struggled to complete projects on time and on budget.

Struck by the escalating cost of jet fighters, Norman R. Augustine, who at the time was an executive with Martin Marietta Corporation (a US aerospace manufacturing company that merged to become Lockheed Martin in 1995), was inspired to write the first of 52 satirical aphorisms about the complexities of managing large aerospace programs. In 1983, he compiled them into a book called Augustine’s Laws.

Almost 40 years later, Augustine is confident that many remain valid. Although he wishes that were not the case, he also doesn’t believe technology can solve all of the ills chronicled by the laws.

Still, while many of the original laws have withstood the test of time, modern technology is making important inroads in addressing others.


An example of a law that would seem out of step with today’s leading software solutions is Number 15, which states: “The last 10% of performance generates one-third of the cost and two-thirds of the problems.” Aerospace customers refer to this conundrum as the pursuit of “exquisite solutions” – ultimate performance, whatever the cost.

Exquisite solutions were common when Augustine compiled his laws, but government customers now push for innovative technology that’s just good enough to meet their requirements affordably. To help the industry achieve this balancing act, many organizations rely on business innovation platforms designed to help engineers optimize designs while keeping a firm eye on costs. Simon Briceno, senior research engineer at Georgia Institute of Technology’s Aerospace Systems Design Lab, has a firsthand appreciation for modern digital solutions’ ability to help engineers balance performance with cost.

“Past methods of building and testing physical prototypes for each design are just too expensive and time consuming,” he said. “We needed a way to reduce the design, integration and testing time of unmanned aerial systems. The way we do this is by creating, analyzing and testing our design and behavior in a virtual environment.” In general, testing in a virtual environment using multi-physics, multi-discipline simulations lowers the cost of testing and certifications by more than 25%, compared to physical tests. Virtual tests also can be completed in most cases in about 30 minutes, compared to two weeks or more for physical tests.


Ask aerospace customers what rivals affordability at the top of their wish lists and they’re apt to say “shorter development cycles.” This goal, however, flies in the face of Augustine’s Law Number 24: “The only thing more costly than stretching the schedule of an established project is accelerating it, which is the most costly action known to man.”

That give-and-take relationship may be changing, however, for aerospace companies that are using the latest state-of-the-art software solutions. For example, at Embraer, a Brazil-based commercial and executive jet manufacturer, about 4,000 members of the company’s engineering, technical and product-support team use a concept-through-manufacturing product innovation platform, as do many shop floor, pre-design and customer service-oriented employees. Adoption even extends into Embraer’s supply chain.

“This has allowed Embraer to shorten the time it takes to shepherd new products through development, from conceptualization to manufacturing, due to improved communications among functional teams, as well as the elimination of some documentation and intermediate steps,” said Humberto Pereira, Embraer’s vice president of engineering and technology.


Augustine’s Law Number 35 addresses another historical industry weakness: its traditional inability to analyze its data to increase efficiency and quality. The law states: “The weaker the data available upon which to base a conclusion, the greater the precision should be quoted in order to give the data authenticity.”

Using the right digital solutions, however, aerospace companies today are getting more skilled at mining big data to improve operating performance in a way they couldn’t decades ago. “The ability to connect a complete suite of digital tools is allowing us to address a range of design and manufacturing problems,” said Roland Gerhards, chief executive of ZAL Center of Applied Aeronautical Research in Hamburg, Germany. “It’s all about big data and making sense of all the information that digital tools provide.”

Indeed, the ability to correctly interpret the voluminous information that digital applications generate also threatens Law Number 37: “Ninety percent of the time things will turn out worse than you expect. The other 10%, you had no right to expect so much.”


At US-based aerospace and defense technology company Northrop Grumman, managers are using a suite of digital solutions to predict performance outcomes earlier, saving time and helping to avoid cost overruns caused by late-cycle discovery of design issues.

“We are incorporating advanced sensors, analytics and IoT (Internet of Things) solutions to provide not just first-time quality of products, but also integrating digital information from key automation assets and business systems to enable unparalleled shop-floor visibility and proactive decisions throughout our factories,” said David Tracy, a vice president of production programs at Northrop Grumman.

Law Number 42, another example of the apparent disconnect between the digital age and what was possible in the early 1980s, states: “Simple systems are not feasible because they require infinite testing.”

While physical testing remains time-consuming and costly, engineers using a business innovation platform can rapidly test a virtually unlimited number of component combinations. Digital testing helps engineers confirm not only which design performs best, but also which design can be manufactured and maintained least expensively.

As David Miller, chief technologist of NASA, puts it: “We’re at the point where 3D models and testing are coming together. Of the things we’re less certain about, we focus on the interplay between the two, using advanced digital tool sets we didn’t have until recently.”

For information on Dassault Systèmes solutions for accelerating time-to-market and reducing costs, please visit:

Capturing customers

Amazon and Alibaba lead the way with experience-centric business models

Rebecca Lambert

4 min read

From car vending machines to checkout-free shopping – businesses are pulling out all the stops to deliver exciting new customer experiences that are more convenient and personalized than ever. Those doing it best are finding new ways to capture their customers’ attention and keep them coming back.

In a Seattle, Washington, convenience store, shoppers pick up snacks, groceries and toiletries and simply walk out with them. No, it’s not shoplifting. It’s the latest Amazon experiment in giving customers what they want – a quick, wait-free, brick-and-mortar shopping experience.

Cameras and sensors track shoppers’ activity as they browse and pick up items, creating a virtual cart. When they walk out of the store, the app calculates their total bill and charges it to their Amazon account.

The concept seems to be hitting the mark with consumers. The store has a 4.6-star rating on a 5-star scale on Google reviews. Customers have called it “the future of retail,” praising the “unique, seamless” shopping experience. “The future of highly advanced, convenient, and affordable grocery shopping is here!” one reviewer posted.

Across all industries, businesses are scrambling to keep pace with rising consumer expectations by developing new customer experiences (CX) that are personalized, innovative and disruptive. In a 2017 Forrester survey, “Pivot to Person-First Personalization,” more than two-thirds of the firms surveyed said that delivering tailored experiences is a priority. The reason is obvious: as Forrester’s 2016 report, “The CX Transformation Imperative,” observed: “CX leaders grow revenue faster than CX laggards.”


One business that understands the importance of putting the customer at the heart of its strategy is German orthotics and prosthetics manufacturer Mecuris, which has digitalized its fitting processes and automated its design process. Digitalized designs are output to 3D printers, which make devices custom-designed to each patient’s anatomy and movements – in a matter of hours.

“What we want to deliver, above just mobility, is a custom design,” said Felix Gundlack, the company’s CTO. The result is superior customer experience: prosthetics made to fit and look exactly how the patient wants.

Mecuris CEO Manuel Opitz recalls making a pink prosthetic for a 3-year-old girl and putting a unicorn design on the side. When she put it on for the first time, she said: “This is my foot.”

“In the past, we have tended to hide away mobility impairments,” Opitz said. “[But now] we want to create entirely new feelings about so-called patient aids and use them, more like other devices. We’ve seen this in eyewear. In the past it was something which you didn’t want to show off. Now it’s a statement – an accessory. We want to go the same way with prosthetics and orthotics, so that people are proud of them."

Like Mecuris, business leaders across the spectrum are realizing that new, automated technology means that personalizing at scale has become affordable. Austrian-based global automotive equipment manufacturer 3CON, for example, is using new technology to collaborate more effectively with its customers and simulate customized solutions, cutting design time by 30% in the process.

“Designers now have the freedom to add custom-made individual components to existing part designs to create new ones that are customized, yet cost efficient,” 3CON CEO Hannes Auer said.


An increasingly popular way to create customized experiences is to involve consumers in designing them, a practice called “open innovation.”

“Open innovation is about involving all of an organization’s stakeholders in its innovation process: employees, suppliers, startups, universities, and customers,” said Martin Duval, founder and co-president of French open innovation consultancy firm bluenove. “Companies are usually involving the customer too late in the innovation process, only when they have almost ended the design of a new product or service, or to get the reaction to a final test.” If new ideas or objections to the product are discovered at this stage, rather than at the initial concept stage, the cost to address them can be prohibitive.

The payoff is twofold. Businesses are able to solve problems and identify new opportunities and, at the same time, create a more customer-centric culture because the customers are involved in decision-making every step of the way.


Forrester’s 2017 report “Improving CX Through Business Discipline Drives Growth," found that companies with a superior customer experience score grew revenues five times faster than competitors with an inferior score.

In the automotive industry, this superior experience centers on loyalty. According to J.D. Power’s annual US Customer Service Index (CSI), customers who are satisfied with their experience are more likely to recommend using a dealer for service or sales; satisfaction also affects their loyalty intentions toward a particular brand or model.

“Satisfied customers tend to stay with a brand and bring others with them,” said Chris Sutton, vice president of the US Automotive Retail Practice at J.D. Power. “Anything less opens the door for customers to shop elsewhere.”

Shoppers scan the Amazon Go app on their mobile devices as they enter the Amazon Go store, on January 22, 2018 in Seattle, Washington. After more than a year in beta, Amazon opened the cashier-less store to the public. (Image © Stephen Brashear / Getty Images)


To keep customers coming back, companies are finding new ways to improve the experiences they offer. For example, French grocery retailer Intermarché and cheese manufacturer Savencia recently surveyed customers and discovered that they would prefer to shop for cheese according to how they would actually use it, as opposed to its type – hard cheese, soft cheese and so forth. Working together, they planned out new display strategies.

“We’ve had really positive customer feedback,” said Marie-Noelle Lefebvre, manager (franchisee) at Intermarché Buc. “And when we looked at the figures, we saw a 4% sales increase. This is pretty good at our point of sale, especially as we have many fresh food and dairy products.”

Other businesses, including online marketplace Alibaba, are building new businesses from scratch. Alibaba’s “New Retail” concept – using technology to seamlessly merge the online and offline shopping experiences – aims to consistently engage the customer at every step of their journey.

“Current consumers live very rich digital lives, but they also require a lot of the experience in the offline world,” said Jason Ding, a partner at consulting firm Bain & Company’s Beijing office and author of a joint report with Alibaba’s AliResearch unit on New Retail. “Brands need to follow the consumer to really provide an integrated experience.”

In Guangzhou, China, Alibaba unveiled a new kind of car shopping experience in partnership with American carmaker Ford. Customers can browse more than 100 car models on an app, choose one they want to test and then pick it up from a giant multi-story vending machine, which dispenses their chosen car in less than 10 minutes. They can drive it for up to three days before making a purchase decision.

In the future, bluenove’s Duval believes that progressive businesses will go beyond shopping experiences to tackle social issues in partnership with their customers, as well. “The next thing will be for brands to open democratic debates through a massive collective intelligence process, to mix their ability to merge customer needs with citizen aspirations,” he said. “The most innovative products and services will also be the ones which are most in line with our social issues.”

Discover more information on new customer experience in transportation and industrial equipment:

Tech trends in health

Five top technologies advancing personalized health

Charles Wallace

5 min read

Technology gives patients and physicians earlier insights and innovative treatment options for personalized health care. Five key technologies are: 3D modeling, 3D printing, artificial intelligence (AI), combination therapy and innovation platforms.

Technology is increasing the control that individuals have over their health, giving patients and physicians earlier insights and innovative treatment options. Compass looks at five top technologies driving the personalized health trend, from 3D modeling of human physiology to innovation platforms that streamline research from concept to cure.

For generations, health care has been characterized by one-size-fits-all medicine. From prosthetics to drugs, treatments were designed to work for the largest possible population. Patients who fell outside the norm were out of luck.

Rapidly advancing technologies are changing that paradigm, however, leading to an era of “personalized health” – using a patient’s own biology, lifestyle and environment to anticipate illness, customize treatments and choose the best therapies to improve outcomes and minimize side effects.

“Technologies such as 3D printing, simulation and ‘digital twins,’ and artificial intelligence are at the forefront of the Precision Medicine revolution,” said Daniel R. Matlis, president of Axendia, an analyst firm focused on life science and health care trends and insights. “Leveraging integrated platforms, life sciences and health care companies are using these technologies to accelerate the delivery of next-generation products that are personalized, more precise and higher quality than ever before, resulting in improved patient outcomes.”

Five of the top trends advancing personalized health include:


A cerebral aneurysm, a ruptured blood vessel in the brain, is a life-threatening medical condition. Although the rupture can be seen on an angiogram, surgeons don’t fully know what to expect until after the operation begins. That precarious situation is changing, however, thanks to innovations that create a computerized “digital twin” of human organs.

Biomodex, a French-American medtech company, uses software to convert medical imaging, including CT, MRI and Echo scans, into precise digital models. The software models the exact twists and turns of the patient’s blood vessels, then adds the patient’s biomechanical properties into the digital model so surgeons can accurately plan the operation.

In addition to converting scans into lifelike digital models, Biomodex 3D prints an exact replica of the aneurysm. By producing a custom, virtual model that captures the complexity of the patient’s own brain vessel, Biomodex helps the surgeon know exactly what to expect – and even practice the operation in advance.

“Our vision is to drive more decisions before the operation in a safe and replicable environment instead of doing this during the operation on the patient,” Biomodex CEO Thomas Marchand said. “It makes a big difference.”

The Biomodex offering is a major breakthrough. “Aggregation of digital twins with real-world evidence enables the evaluation of health care scenarios ranging from quantum mechanics to organ simulation to population health modeling that would be impossible to test in the real world,” he said.

The result is nothing short of revolutionary, said Sandra K. Rodriguez, market analyst at Axendia.

“The use of digital twins and virtual human models to support value-based care may sound like science fiction,” she said. “But companies like Biomodex are leveraging game-changing advancements in science and technology to turn them into science fact.”


3D printing is another breakthrough technology, enabling production of lightweight, affordable prosthetics customized to each patient’s need.

Easton LaChappelle is the founder and CEO of Unlimited Tomorrow, which specializes in an “ultra-realistic, 3D-printed robotic hand that can be used as a prosthetic.” The company provides families of people who need prostheses with portable 3D scanners, which capture a digitalized model of the remaining limb. Unlimited Tomorrow then uses the data to create a mirror-image prosthesis so accurate, it comes with freckles and paintable fingernails.

Paired with computerized sensors and controls, the limbs are capable of gestures such as gripping. Traditionally, such prosthetics have cost US$100,000 (€84,762) or more, putting them beyond the reach of many and making them impractical for fast-growing children. Thanks to the software’s ability to automate the creation of a custom-designed 3D model and then 3D-print a personalized limb, Unlimited Tomorrow’s prosthetics will be available at a starting price of about US$5,000 (€4,238).

Byung Wook Kim won the bronze medal in the power exoskeleton event at Cybathlon 2016, wearing the WalkOn Suit developed by SG Robotics and Sogang University. (Image © SG Robotics)

SG Robotics, a South Korean company, is applying similar technology to custom-designing intricate exoskeletons that allow paraplegics to walk again. “Making physical mock-ups is very time consuming,” said Na Byung Hun, a designer at SG Robotics. “We use our software to visually prototype parts before we physically assemble them.”


Artificial intelligence (AI) is another trend in personalized health care. Consultants Frost & Sullivan estimated the global AI market in health at US$633 million (€541.2 million) in 2014, but projects it will grow to US$6.6 billion (€5.64 billion) in 2021 – a compound annual growth rate of 40%.
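Those figures hang together: compounding the 2014 base at 40% a year over the seven years to 2021 lands close to the projected total, as this quick check shows.

```python
# Frost & Sullivan figures: US$0.633 billion in 2014, ~40% CAGR to 2021.
start_billion, cagr, years = 0.633, 0.40, 7
projected = start_billion * (1 + cagr) ** years
print(round(projected, 2))  # 6.67 -- in line with the quoted US$6.6 billion
```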

Meta, a Canadian company, uses AI to read and understand the thousands of scientific papers issued each week, then analyzes them to provide researchers with insights.

British pharmaceutical firm GlaxoSmithKline (GSK), meanwhile, is using machine learning, a specialized form of AI, to develop medicines more quickly. With genetic information on 500,000 volunteers who donated their DNA to the UK Biobank, GSK is using AI to model and predict every gene’s effect on disease. Identifying patterns of gene settings in people with specific conditions will give GSK a roadmap to which genes to turn on or off, allowing it to design new treatments more quickly.

Karenann Terrell, who manages GSK’s digital, data and analytics strategies, predicts that machine learning will make it possible to reduce the development time of new drugs from about five years to around one year, according to the website AI Business.

“Axendia’s research shows that life science companies are drowning in meaningless data,” Matlis said. “The industry is obsessed with collecting, retaining and hoarding data to meet statutory and regulatory requirements. Unfortunately, most companies do not harness them.” As GSK is demonstrating, however, “AI tools can help distill value from these data to speed time-to-market while improving quality and patient outcomes.”


Sometimes, a drug works better when combined with other substances or is easier to take when delivered with a unique device or packaging, an approach known as combination therapy. For example, clinical trials show that treatments that combine aspirin, blood-pressure-lowering medication and a statin into a single pill reduce the risk factor for heart disease while increasing patients’ adherence to their dosing schedules. Combination therapies come in many forms, combining two or more drugs, biologics or medical devices to deliver a treatment.

The Biomodex EVIAS (endovascular intracranial aneurysm solution) models a specific patient’s aneurysm, allowing detailed advance planning of surgery. (Image © Biomodex)

Portal Instruments, a medical device firm based in Cambridge, Massachusetts, has developed a novel solution for administering large-molecule treatments: a needle-free injector that uses precise, high-pressure technology to administer a stream of medicine, thinner than a human hair, directly through the skin. Japan’s Takeda Pharmaceutical Company has partnered with Portal to use its injector for administering Entyvio, a monoclonal antibody for adults with ulcerative colitis.


Platforms are well known as a way to book a car-sharing service or download music, but they’re also speeding the development of many personalized health innovations. Cloud-based platforms, for example, work across disjointed legacy systems, eliminating information silos to accelerate problem-solving.

Because they can be accessed by authorized users from any internet-connected device with a simple web browser, cloud-based innovation platforms also enable collaboration throughout an organization and with multiple external partners.

In planning brain aneurysm surgeries, for example, it wasn’t enough for Biomodex to model the patient’s damaged brain artery. To guide the repair of the rupture, Biomodex also had to develop a way to convey how the artery actually worked with blood flowing through it.

Biomodex developed an algorithm, called Invivotech, which combines patient information, diagnostic data and physician input, all of which is automatically gathered, organized, analyzed and modeled on the company’s collaborative platform. The result: a 3D-printed model that mimics the in vivo mechanics of organic tissues in a matter of hours, not weeks or months.

“It’s the positive disruption of health care,” Matlis said. “Innovative technologies are enabling the Precision Medicine revolution, resulting in improved health outcomes for all of us.”

For information on Personalized Health, please visit:
