Self-organizing circuits

In the race to replace silicon, carbon nanotubes get a big boost

Dan Headrick
26 October 2013

5 min read

Carbon nanotube technology shows promise as a potential replacement for silicon in the fabrication of semiconductor chips, but it is not the only technology vying for that honor. With silicon approaching its physical limits, the desire for alternatives is high, and investors everywhere are eager for a front-runner to emerge.

Since October 2012, when a team of IBM researchers announced its success with a new technique for building semiconductor circuits from carbon nanotubes (CNTs), technology watchers have been abuzz about the long-anticipated demise of silicon. Still, industry and investment analysts have yet to pick a clear winner.

“The prospects are certainly there for nanotubes,” said Pallavi Madakasira, who leads energy electronics research at Lux Research, a Boston, USA-based global analyst firm focused on emerging technologies. “Silicon clearly is reaching the limits of its performance. IBM’s approach has promise, but it needs to be transferred to incumbent solutions for manufacturing, leveraging existing equipment being used now in semiconductor fab facilities.”

Investment in nanotechnology, a broad industry segment that includes nanotubes as well as other nano-materials, is robust and gaining ground. Lux Research estimates that global sales of products containing some nanotech components will reach US$2.4 trillion by 2015. Government investment in nanotechnology is up worldwide, according to analysts at London, UK-based Cientifica, an analyst firm specialized in nanotechnology, new materials, life sciences and big data. The Japanese and US governments lead the world in nanotech funding, Cientifica found, followed by the European Nanotechnology Trade Alliance (ENTA), China, Singapore, South Korea and Taiwan.


Silicon is cheap, abundant, easy to work with and, for the past half century, the material heart of the digital world. Since 1965, when Intel co-founder Gordon Moore made the observation now known as “Moore’s Law” – that the number of transistors that could be packed onto a single silicon chip would double every 18 to 24 months – the industry has surprised even itself by actually achieving it. Transistors have shrunk so much, in fact, that they are approaching dimensions measured in atoms.
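Moore’s observation can be framed as simple exponential growth. The sketch below, in Python, is purely illustrative: the 2,300-transistor starting point (the Intel 4004 of 1971) and the 24-month doubling period are assumptions chosen for the example, not figures from this article.

```python
# Moore's Law as a simple growth model: transistor count doubles
# every `doubling_months` months. The 2,300-transistor start
# (Intel 4004, 1971) is an illustrative assumption.

def transistors(years_elapsed, start_count=2300, doubling_months=24):
    """Projected transistor count after `years_elapsed` years."""
    doublings = (years_elapsed * 12) / doubling_months
    return start_count * 2 ** doublings

# Roughly four decades of doubling every two years turns
# thousands of transistors into billions:
print(round(transistors(42)))  # 4823449600 with these assumptions
```

With those assumptions, about four decades of steady doubling turns a few thousand transistors into the billions found on today’s chips – exactly the trajectory the article describes.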

“If the stars align, the benefit is energy-efficient computation.”

Wilfried Haensch
senior manager of physics and materials for logic and communications at IBM, on the potential of nanotubes

While the size of transistors continues to decrease, silicon’s ability to conduct electricity at such tiny sizes is nearing its limit. Today’s silicon transistors need about 1 volt to change a bit from 0 to 1. Switching creates heat, and heat limits how densely structures can be squeezed onto a chip. This is why clock speed – the rate at which a silicon transistor can flip a 1 to a 0 – essentially peaked a decade ago.

To compensate for stalled clock speeds, engineers developed parallel computing – “quad-core” smartphones, for example, with four processors that split tasks and handle them simultaneously. But soon, experts agree, silicon will reach its limits, which helps to explain the excitement surrounding IBM’s announcement, published in the journal Nature Nanotechnology, that its researchers had demonstrated the practical viability of carbon nanotubes as an alternative to silicon.

The IBM team at the T.J. Watson Research Center in Yorktown Heights, New York (USA), was the first to precisely arrange CNTs in trenches etched into the surface of a silicon wafer, using them to build chips with more than 10,000 working transistors. That may not seem like much, given that a silicon chip today can contain more than 1 billion transistors, 100,000 times what IBM achieved with CNTs. It proves, however, that nanotubes can be arranged into circuits, offering the potential for processors to become smaller, consume less power and generate less heat, since CNTs conduct electricity with less resistance than silicon.

“Right now, to replace silicon, this is the leading edge,” said Wilfried Haensch, senior manager of physics and materials for logic and communications at IBM, who describes his role in the research as “a guiding hand in leading the team to develop something that can eventually merge into technology.”


CNTs are formed by rolling a sheet of individual carbon atoms, arranged in a hexagonal matrix, into a structure that resembles a microscopic roll of chicken wire. CNTs have impressive semiconducting properties and transmit electrons efficiently, but arranging the tubes into perfect circuits on the surface of the chip has been a daunting challenge.

The team of eight researchers at IBM developed a chemical “self-assembly” process, the first success in arranging the tubes in a predetermined pattern on the chip. IBM’s complex, multi-step process involves an intimidating vocabulary of molecular chemistry, but works on basically the same electrostatic principle as a laser toner printer. “It’s actually very simple,” Haensch said. “It’s amazing that it works, but don’t be fooled. You have to have the right molecules. That’s where the secret sauce is.”

With 10,000 working transistors, IBM achieved a functional chip density of 1 billion nanotubes per square centimeter. While that density can’t approach what is needed to replace traditional silicon, it is 100 times better than any previous attempt, creating optimism that silicon’s replacement is within striking distance. “If the stars align,” Haensch said, “the benefit is energy-efficient computation.”

Artist’s rendering of a carbon nanotube (CNT) Field-Effect Transistor (FET) with a bottom gate (where current is modulated). The red coils symbolize the CNT’s contacts. (Image courtesy of IBM)

IBM is now focused on refining the purity of the carbon nanotube material to optimize its semiconducting properties, and on developing methodologies to verify that purity. The required material purity is one metallic CNT per several hundred thousand CNTs assembled into a circuit; verifying that level is today’s bottleneck. “It’s not hard to build devices and measure their properties,” Haensch said. “However, it is very time-consuming to measure the large number of devices — several million — needed for these purity levels. It would be desirable to have a simple shoot-and-click method to obtain the purity level, but this doesn’t exist now.”
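Simple binomial arithmetic shows why verification at these purity levels is so time-consuming. The sketch below is a hedged illustration: the assumed impurity rate of one metallic CNT in 300,000 and the 95% confidence target are example values, not IBM’s actual figures.

```python
import math

# If the target impurity rate is p (we assume p = 1/300,000 for
# illustration), the chance of observing at least one defect in n
# tested devices is 1 - (1 - p)**n. Solving for n shows how many
# devices must be measured just to *see* a defect reliably.

def samples_needed(p, confidence=0.95):
    """Devices to test so a defect at rate p is observed with given confidence."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - p))

p = 1 / 300_000  # assumed impurity rate, not IBM's exact figure
print(samples_needed(p))  # on the order of 900,000 devices for 95% confidence
```

Under these assumptions, nearly a million devices must be measured just to expect to catch a single metallic tube with high confidence – consistent with Haensch’s “several million” estimate for characterizing purity rigorously.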

The team also must refine the accuracy and precision of its nanotube placement methodologies. Haensch’s target is to develop new device technology that functions in a nano-systems environment by 2020.


Silicon hasn’t reached the end of its road, however. Transistors are still getting smaller, and chipmakers have worked out the next few generations of silicon chips. “Silicon technology doesn’t sleep,” Haensch said. “It will advance, too.”

For example, a team of researchers at the University of New South Wales has published results supporting its prediction that silicon transistors will reach atomic scale before 2020. Global chip-maker Intel is working on its smallest transistor yet – 14 nanometers – a scale that is close to the manipulation of individual atoms.

David Carroll, director of the Center for Nanotechnology and Molecular Materials at Wake Forest University in Winston-Salem, North Carolina (USA), applauds IBM’s research but is cautious in his expectations. “It’s a wonderful milepost along the way, but there’s still a lot to be done,” Carroll said. He predicts it will take decades for nanotube chips to become commercially available, even if all of the technical hurdles can be overcome.

In fact, Carroll said, CNTs may not be the technology that ultimately offers the first commercially viable replacement for silicon. Nanotubes weren’t even mentioned in 2013 presentations at the prestigious Kirchberg Winter School in Austria, for example. Instead, graphene was discussed as the new “super material.” Other intriguing possibilities include silicon photonics, which transmits information with light. Meanwhile, research continues into DNA computers, which exploit the vast number of distinct DNA molecules to solve specialized problems in parallel. Researchers also are working on what Einstein called the “spooky” science of quantum computing.

Lux Research analyst Madakasira, meanwhile, is excited about gallium nitride, a hard and stable material with high heat capacity and thermal conductivity that has been used in LEDs (light-emitting diodes) since the 1990s.


As the future of silicon technology shrinks, the range of new applications expands. Carroll points to research into organic devices used to repair human organs, and neuro-networks that allow people to control prosthetics or even keyboards with their brains. In fact, Carroll’s NanoCenter recently announced that its researchers had developed a new fabric that generates power from the heat and motion of the human body.

What really excites the market even more than CNTs, Carroll believes, is the benefit of having options. “The urgency that you hear is translated in the market that we need a new material,” Carroll said. “But even that’s a little bit of a misunderstanding. What you’re hearing is the desire for continued growth in new electronic applications.”

Uncertain times

Mine owners focus on technology to control costs

Lisa Rivard

3 min read

Today’s economic environment has fundamentally altered commodity demand and prices, making it more difficult for mining companies to predict future patterns. Although long-term demand for natural resources, particularly by industrializing nations, is expected to rise, short-term pressures are forcing companies to examine their operations.

During the mining boom of recent years, mining companies focused on getting as much material out of the ground as quickly as possible; closely monitoring costs and overall productivity was not a priority. The result: the mining industry benefited from rising output, but didn’t reap the marginal profit growth that should have come with it.

“Mining was a low-growth business for much of the 20th century, so we were caught off-guard by the pace of China’s early 21st-century urbanization and industrialization,” Andrew Mackenzie, CEO of global resources company BHP Billiton, said in a June 2013 speech to the Melbourne Mining Club. “Demand was met in part by higher cost – much higher cost – operations. And many invested poorly to the detriment of their owners. Finding five dollars of savings per metric ton did not seem as pressing when prices were skyrocketing. But it really matters now.”

With the “low hanging fruit” of the easiest-to-mine deposits gone, companies are forced to enter more remote and costly-to-develop regions. Securing labor, favorable political environments and financing – and meeting monitoring and safety challenges – all become more difficult and expensive in far-flung locations.


In its recent “Tracking the Trends 2013” assessment of the mining industry, global consulting firm Deloitte reported that costs are reaching “unsustainable highs, and unless companies improve operational efficiency, proactively control maintenance costs and invest in cost-reducing technologies the trend is likely to continue.” The tough conditions and challenges faced in many mining operations “mandate a level of analytical capability that many companies lack,” the Deloitte report stated. “Significant rewards will be available to companies that invest today.”

The call to action to invest in new technologies is not falling on deaf ears. “In recent times, mining companies have been making money no matter what they do, but that’s quickly changing,” said Chris Holmes, Head - International at IDC Manufacturing Insights, a global research and consulting firm. “A fundamental rethink is happening on the way these organizations are run. There is a focus on productivity and efficiency that leads to a discussion on technology, ranging from project management and supply chain solutions to complex communication and simulation tools.”



Mark Cutifani, CEO of British multinational mining company Anglo American, advises the mining industry to look to other industries, such as petroleum, aviation and manufacturing, for technologies that have been used to address these needs, including design, mapping, modeling and simulation solutions. Cutifani told shareholders at his company’s April 2013 annual general meeting that “while many things have been achieved, we cannot continue to do business as usual. Our share price is languishing compared to our peers, and we are not being rewarded for the potential we have embedded in the asset portfolio.”


From site modeling and supply chain optimization to automation of mine operations, maintenance monitoring and strategic planning, the opportunities for gaining efficiencies and reducing costs are numerous.

“We must be able to set up mines and logistic systems in ways that we are able to react to changes in demand and reduce risks and uncertainty,” says Dr. Jörg Benndorf, assistant professor of resource engineering in the Department of Geosciences and Engineering at Delft University of Technology (TU Delft) in the Netherlands. “Setting up mining projects takes 10-15 years with billions invested. You can’t just react to the market; you need the proper strategies to justify the investment and implement projects successfully.”

Some mining companies are turning to experts from other industries to lead the charge. “A couple of natural resources companies have brought in supply chain managers with retail and consumer goods backgrounds to implement some of the latest supply chain process thinking and supporting technologies,” IDC’s Holmes said. “And that’s a trend I think we’ll see continue.”

Cutifani encourages the mining industry to develop “an operating and project-delivery model that takes inspiration from beyond the mining industry, to support the implementation of the plans and disciplines necessary to improve our ability to execute our strategy. In doing so, we will also establish the underlying processes necessary to improve our competitive operating position, to improve margins and to increase capital returns.”


The road ahead isn’t necessarily an easy one for the mining industry. The biggest challenge, Holmes and Benndorf agree, will be tackling change.

“Mining is a very conservative industry, and implementing change usually takes a while,” Benndorf notes. “The right driver has been missing because mining has been such a lucrative business. But now we’re being forced to look at cutting costs. Smarter ways of doing maintenance, mine planning and implementing production become more appealing to companies and they are more willing to change in this situation, but it requires a commitment from the top.”

Holmes believes a change in culture is a prerequisite for technology adoption. “We need to start talking about lean techniques in mining as other industries do,” he urged. “It’s important for mining organizations to look at other industries to see how they’ve embarked on that journey. We need a very fundamental change in thinking and a cultural change that is organized from the top down so that things get implemented.”

Market dominance

Success at New Product Introduction also builds consumer loyalty

Dan Headrick

7 min read

From early concept to distribution, high-tech companies choreograph dizzying volumes of information and actions to deliver new products to consumers on ever-shrinking schedules. To manage the New Product Introduction (NPI) challenge, high-tech manufacturers are adopting new tools and processes and developing new ideas about the very nature of their organizations.

Consumers rarely think about the complex New Product Introduction (NPI) process that brings amazing electronic products to retailers’ shelves every few months, but the high-tech companies that produce those devices fight relentlessly for every possible advantage in the field.

As devices become more complex – combining advanced electronics, software, and mechanical design – and development cycles become shorter, the potential for errors that can delay an introduction or kill a product in the marketplace becomes greater.

Michael Zapata, a technology industry entrepreneur and executive board chairman of Protochips, which develops analytical tools for nano-scale materials research and development, highlights three elements that high-tech companies must master to succeed at NPI:

1. Tools, processes and methodologies, which represent “the table stakes” to even play in the game.
2. A corporate culture of innovation exhibited at all levels of management.
3. The right people in key positions who know how to champion and promote NPI.

A company can get the first two right and still trip with the wrong person in a key leadership post. “If he’s talking about processes and bureaucracy, he’s just a manager,” Zapata said. By contrast, a leader will talk about the market, the risks, and about translating ideas into action.

Michael Keer, founder and CEO of California (USA)-based Product Realization Group (PRG), which provides NPI consulting to high-tech companies, agrees. “The tools available out there are vast,” Keer said. “Unfortunately, the people available with the skills to leverage them are limited. So much depends on a company’s management team and culture. Those companies that invest in operational efficiency early in the business cycle tend to be more successful. Those that focus on marketing or technology without investing in a strong NPI process tend to be weaker.”

Typical warning signs of NPI weakness include a lack of management understanding or support; insufficient capital and resources, including expertise; unrealistic schedules; designs that are not scalable; faulty or incomplete testing; incorrect tooling for manufacturing; and poor communications, according to a 2012 white paper Keer authored titled “New Product Introduction (NPI): Seven Best Practices.”


Sanjay Keswani, CEO of Consensia, a Silicon Valley software firm that helps high-tech companies manage their NPI processes, counsels executives who face any of these challenges to step back and take a holistic view of their organization. With so many disparate and simultaneous activities required to bring a new product to market, it is easy to lose sight of the big picture.

“You need a really good cross-functional team with a strong leader,” Keswani said. That leader, overseeing a strong team responsible for a vast array of disciplines and expertise, needs to be able to stay focused on a very basic question: “How do I take all those relevant ideas and get something that addresses the need of the market?” To ensure success, Keswani urges NPI leaders to:

• Create what the industry refers to as clear “phase-gate” release processes that allow managers and teams to assess progress along the way. At each gate, key questions can be examined. For example, were previous steps properly executed? Does the business rationale still hold? Are resources still available? At each gate, decision makers can decide to go, kill, hold or recycle a project.

• Limit risk every step of the way. “That comes with experience,” Keswani said. “Consider the challenge of changing specifications after they have been approved. It’s OK to change, but you’ve got to manage the risk.” For inspiration, Keswani suggests high-tech leaders look at more mature industries like automotive and aerospace & defense, which have a process orientation to analyze and manage change effectively. “High-tech needs to draw upon more from these industries and put in systems engineering disciplines to manage product complexity,” he said.

• Adopt rapid prototyping and accelerated testing using 3D computer-aided design (CAD) models to analyze fit, form and functional performance in the virtual stage, before moving to physical testing. “Simulation software has become a vital quality assurance tool in not just testing product design, such as drop testing, but also in ensuring that part suppliers are current on changes, and that the decisions they make mesh with the decisions of related parts and systems,” Keswani said.
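The phase-gate pattern from the first bullet above can be sketched as a tiny decision routine. Everything here – the Decision names and the order in which the gate questions are applied – is an illustrative assumption, not a description of any specific company’s process.

```python
from enum import Enum

# A minimal sketch of the phase-gate pattern: each gate maps a few
# yes/no assessments to one of four outcomes. Names and ordering
# are illustrative assumptions.

class Decision(Enum):
    GO = "go"            # proceed to the next phase
    KILL = "kill"        # stop the project
    HOLD = "hold"        # pause pending resources or information
    RECYCLE = "recycle"  # redo work in the current phase

def gate_review(prior_steps_executed, business_case_holds, resources_available):
    """Apply the three gate questions from the text, most severe outcome first."""
    if not business_case_holds:
        return Decision.KILL
    if not resources_available:
        return Decision.HOLD
    if not prior_steps_executed:
        return Decision.RECYCLE
    return Decision.GO

print(gate_review(True, True, True))  # Decision.GO
```

A real stage-gate system would attach evidence, metrics and sign-offs to each question; the point of the sketch is only that each gate reduces a small set of assessments to a single, unambiguous outcome.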

Early planning is critical, experts agree. Planning includes coordinating downstream activities that are interdependent but must occur simultaneously, including engineering and operations, product development and testing, market analysis and regulatory compliance, manufacturing processes, logistics and supply chain requirements, packaging and distribution, marketing, sales and support training.


PRG’s Keer points to a tremendous explosion of resources, including sophisticated Product Lifecycle Management (PLM) systems, that make it easier for cross-functional teams to handle simultaneous activities by giving everyone real-time views of a product’s progress. “In the last five years there has been real improvement in the storage, control and communication of the product record, which now enables companies to overcome the historic fragmentation of information and achieve a strong NPI process,” Keer said.

Each Panasonic new product introduction, including the 4K tablet shown here, begins with a focus on improving consumer lifestyles. (Photo courtesy of Panasonic)

By helping to coordinate NPI activities that increasingly must run in parallel tracks, such systems also help companies hit ever-decreasing time-to-market deadlines, said Geoffrey Annesley, CTO of California (USA)-based Serus, a supply chain management company.

For example, a great product design typically involves a team of design engineers and a raft of Intellectual Property (IP) management issues that, in the end, must be scaled to manufacture a high-quality product in massive quantities. Manufacturing, typically outsourced, requires its own specific design of the manufacturing process, with its own team of designers and its own IP. “Those two disparate processes need to run in parallel, and they need to feed back to both groups of designers,” Annesley said. “That often is a big mess. Outsourced manufacturing aggravates this.”

Kevin Leedy, director of QCA Supply Chain and Logistics at California (USA)-based fabless semiconductor giant Qualcomm, knows the challenge well. In the wake of its 2011 US$3.2 billion acquisition of Wi-Fi networking technology company Atheros Communications, Qualcomm experienced 30% year-on-year growth in its supply chain transactional activity. But all of its manufacturing was outsourced, and 80% of its operational data was generated outside Qualcomm.

“Those companies focused on operational efficiency tend to be more successful. Those focused on marketing or technology tend to be weaker.”

Michael Keer
CEO, Product Realization Group

“I didn’t have full visibility,” Leedy said in a video webinar on the company’s website. “Clearly there was a disconnect. I was employing multiple full-time employees to respond to those day-to-day changes to resolve issues. It just was not the solution we needed to be fully scalable. It was impacting our growth.”

To automate its processes, Qualcomm implemented a cloud-based, virtual supply chain management tool that could be easily adapted to each supplier. The tool provides real-time, accurate data on material supplies, resource consumption, pay points and invoicing that supported the company’s revenue and transactional growth and freed up its human talent to manage process changes linked to the merger.


Regardless of how you characterize the challenge, however, the driving forces behind every aspect of NPI are interrelated: the need for speed to beat the competition, plus great products that will delight consumers and keep them coming back to buy again and again. Consumer electronics manufacturer Panasonic is a master of both.

“I feel that speed is the biggest issue,” said Shigeo Okuda, head of business development and promotion for Panasonic’s 4K tablet, introduced at the 2013 Consumer Electronics Show (CES). “We of course need to reduce process times in our manufacturing operations, but our greatest challenge is how quickly we can read the changes in the market, reflect them in our development operations, and launch the right products in a timely fashion.”

Staying ahead of the competition is a challenge, because information becomes available so quickly, Okuda said. “Network developments mean that information is now transmitted instantaneously, the market changes rapidly and the digital era has progressed to the point that products of similar performance and fixed quality can be manufactured quite successfully. This has made it difficult to launch, and continue to launch, products that will be truly successful.”

As it develops new products, Panasonic therefore keeps its focus firmly on the customer’s wants and needs. “People form the central focus for everything we do,” Okuda said. “We look into their lifestyles and work hard to improve them. This represents the starting point as we aim to realize better lives for all our customers.”

Panasonic helps to ensure that its products meet customer needs through extensive testing in the digital stage, when changes are easy and affordable to make. “To achieve a greater degree of certainty as far as this success is concerned, we need to set and test hypotheses as to how our customers might use our new products in what kinds of scenarios, and how these products will bring value to the customers as they do so,” Okuda said. “If a customer response then suggests that our hypotheses have been slightly off the mark, we need to work quickly in making adjustments and modifications. If we can do so then we ought to be able to launch new, refined products that will be to our customers’ satisfaction.”

To achieve this goal, Panasonic has employed advanced PLM applications and processes. “Already, we have been able to significantly reduce both time and costs involved in refining our designs, thanks to the advances in digital simulation technology, which is replacing the succession of (physical) prototype models that we had to manufacture in the past,” Okuda said.


Companies that are organized to share ideas in a collaborative dialogue with suppliers, colleagues, and even customers, as Panasonic does, stimulate an open exchange of knowledge that tends to make them successful in their NPI processes, said Robert Handfield, a professor of supply chain management at North Carolina State University in Raleigh, North Carolina (USA) and consulting editor for the Journal of Operations Management. “Very few companies are good at this,” he said. “It takes time to develop relationships and trust.”

Handfield points to Honda as an example of a successful collaborator. The car maker generally gives its suppliers a target cost and a performance specification, not a design, and asks the supplier’s engineers and designers to find the best solution. “That sets off a dialogue,” Handfield said. “Using that approach, the supplier can come to them with ideas. To get to that point, you have to build trust.”

Noel Sobelman, a partner at Kalypso, which helps life sciences and high-tech companies develop new products, said he has seen many of the NPI pitfalls to avoid in his 22 years of industry experience.

“One big problem occurs when companies treat process improvements as if they were the flavor of the month,” he said. “Successful continuous improvement programs require a culture where you continuously look to employees for improvement ideas.”

NPI processes demand their own metrics and depend on change agents in leadership positions with organizational clout. “It’s very important to have people who are not just process policemen who throw a penalty flag when somebody does something wrong,” Sobelman said. “People who understand the link between capability improvement and sustainable business success are hard to find, as well as senior leadership that values their contribution and drive to get things done. Process improvement is not sexy, but doing it well separates the successful from the laggards.”


To succeed at NPI, the experts interviewed for this article recommend that companies cultivate:

• A culture of innovation
• A continuous flow of market and customer insights, forged through collaboration with partners, customers and teammates
• Leaders who can clearly communicate how to translate ideas into action with operational efficiency
• Tools and methodologies to accelerate and streamline work
• Early planning to coordinate interdependent, simultaneous downstream activities
• Automated workflows to provide clear status visibility
• Rapid prototyping and accelerated testing with 3D models and simulation
• Clear processes to balance risk with potential rewards
• Sufficient capital and resources to support realistic release schedules and adequate testing
• Phase-gate release processes to facilitate clear go/no-go decisions
• Metrics-driven continuous improvement processes

High-tech predictions

Shorter cycles will accelerate co-design, increase complexity

Dan Headrick

3 min read

John Blyler writes, teaches and speaks on technology, science and science fiction, and serves as chief content officer of Chip Design, Solid State Technology and Embedded Intel Solutions magazines. He is an affiliate professor of systems engineering at Portland State University, teaching graduate-level programs. Compass recently spoke to Blyler about trends in high tech.

COMPASS: The past decade has seen many milestones in hardware/software co-design. What do you think will stand out in the next decade?

JOHN BLYLER: Thanks to Moore’s Law and the efficiencies of engineering chips and boards, these things have become commodities. Companies have been forced to differentiate themselves with the software. Also, when you design a chip, you have to think about designing the board at the same time, so you get into hardware/hardware co-design, with software tying everything together.

That trend toward tighter integration is only going to accelerate. The time to get your product to market is shrinking, so you need to have software designed while the hardware is being designed. In many instances, the software demands at the user level are dictating what the chip design will be. Before, it was the other way around.

What are today’s biggest challenges in systems modeling, integration, and designing for the user?

JB: When I tell my engineering friends the movement is toward designing for the end user’s experience, they scratch their heads. It’s easy to see how that applies to software, because with software it’s easy to change on the fly. But for hardware, that’s trickier. How is that going to be implemented? That’s something the engineering community and manufacturing community are still wrestling with.

You see it in cell phones. The end-user input must come early in the design cycle as it will affect both the software and electrical-mechanical subsystems. Further, everything has to be low-power and green. You have a mountain of considerations, aside from just getting the product to work.

This is why we’re seeing a growing awareness of Product Lifecycle Management (PLM) issues across engineering disciplines. So many pieces have to come together at the same time, and PLM is a big help with that challenge.

The term ‘cyber-physical system’ (CPS) is often used to describe the coordination of a system’s computational and physical elements. How do you see this trend developing?

JB: The other term I’ve heard for this is intelligent embedded systems, and it is being enabled by hardware and software getting so small that you don’t need a standalone system. We’re living in a world where sensors will be everywhere, and the sensors themselves have intelligence, called sensor fusion. You have a computer, a sensor with its own little network, and a wireless system that connects to a cloud.

You can even extend that to biology. There’s definitely a move to link the genetic space and the electronic space, to see more electronics embedded in the human body. These systems can get pretty sci-fi-ish, where even cells will be used as transistors.

“In many instances, the software demands at the user level are dictating what the chip design will be. Before, it was the other way around.”

John Blyler
High-Tech Editor & Educator

What about the wireless chip market?

JB: If you look at carrier sales, sales of frequency bands, the big push toward 60 GHz, we’re going to see more and more wireless connectivity. So wireless is appearing in big ways and little ways. Little ways like sensors in your tires to monitor tire pressure, or little wireless sensors in a field of crops that will tell you when the plants need watering.

That leads to a discussion of how to power these sensors. Energy scavengers are devices that grab energy from the environment, such as the rotation of the tire. Or out in the crop field, you could use solar, or you could get energy from the pH (acidity/alkalinity) differences between the soil and the plant to power a little RF (radio frequency) circuit and send the information eventually back to the cloud.

It’s all very exciting, and yet high tech is desperate to attract workers. How can the industry motivate students to go into high-tech careers?

JB: I actually teach a class on science fiction and science, which I think helps to engage the imaginations of young people. But technology itself can also help to excite students and highlight the possibilities of high-tech careers. Inexpensive 3D printers, for example, would be a great way to get people interested in technology.

“We’re living in a world where sensors will be everywhere, and the sensors themselves have intelligence.”

John Blyler
High-Tech Editor & Educator

One factor that’s often ignored when talking about high-tech careers is people skills. We need technically savvy folks who also have good people skills and solid emotional intelligence. I think the online push and social media take away a little bit from interpersonal skills. In a global environment, we need interpersonal skills and cultural sensitivity. We can’t only focus on the technical side of high tech.

Social shopping

The digital future of retail

Jacqui Griffiths

4 min read

With the use of PCs, tablets and mobile phones becoming almost universal in developed economies, the rules of the retail game are rapidly changing. Customers expect the same rich experience no matter how they shop, with access that is available whenever and wherever they choose – and retailers are scrambling to deliver.

Retail is rapidly evolving to serve increasingly connected shoppers:

• 62% of consumers worldwide now go online for part of their shopping journey (Ernst & Young, 2011).
• 79% of UK 25-34 year olds who use the Internet before purchase are influenced by social media (Ernst & Young/YouGov, 2011).
• 40% of US cell phone owners use a social networking site on their phone (Pew Internet & American Life Project, 2012), creating the potential to quickly share both positive and negative shopping experiences.

US-based global advisory services and market research firm IDC Retail Insights has dubbed these connected consumers ‘5i’ Shoppers – instrumented, interconnected, informed, in place and immediate.

To deliver the experience these 5i shoppers demand, retailers need to understand how to engage with them across all the channels they use.


With consumers who are always connected, retailers must be on their best behavior. “It’s as if the customer is constantly peering in your shop window,” said Leslie Hand, research director at IDC Retail Insights. “Your brand, your reputation and your products are on display all the time, for all to see, everywhere.”

Failure to understand this new environment can be costly, as feedback on every shopping experience – good or bad – can circle the globe within minutes of it happening.

“There is real power in social media,” said Beverly Macy, author of The Power of Real-Time Social Media Marketing. “You can either harness that power and use it to your advantage as a brand, or it can potentially bite you, hurting your brand in the process.”


Retailers who understand how to engage with consumers across multiple channels – providing an ‘omnichannel’ experience – stand to benefit.

US fashion retailer Nordstrom, for example, combines social media with shopping apps, on-the-spot item location and register-free transactions. It hasn’t all been smooth sailing – the company recently abandoned an experiment in tracking smartphone signals in-store after protests from shoppers. But Nordstrom is making its omnichannel strategy work.


62% of consumers worldwide go online for part of their shopping journey. (Ernst & Young)

“Nordstrom realized years ago that it needed to start connecting the dots with technology because consumers were moving in that direction and it needed to be ahead of the curve,” Hand said. “It has made acquisitions around social media and daily deal sites, and increased the size and profitability of each basket.”

Philip Clarke, CEO of UK supermarket Tesco, told delegates at the World Retail Congress Asia Pacific 2013 in Singapore that the company’s future growth demands that Tesco “change the way we engage with our customers and embrace digital retailing.” That has been the case in South Korea, where the rollout of virtual stores in subways and bus shelters resulted in a 76% increase in registered users of the company’s mobile app and a 130% rise in revenues for Tesco subsidiary Homeplus. “In the future, app development is going to be just as important as property development,” Clarke said.


Understanding the big picture of omnichannel-customer behavior is a key challenge for retailers.

“A customer’s journey might start online with a social interaction with a friend, or inside the store where they might pick up their mobile to compare the product with what other stores have,” Hand said. “Every shopping event – whether or not it’s a buying event – is just as important as the others, because they all feed into the purchase decision.”

“There is real power in social media. You can either harness that power and use it to your advantage as a brand, or it can potentially bite you, hurting your brand in the process.”

Beverly Macy
Author of The Power of Real-Time Social Media Marketing

Retailers need to be able to assimilate, analyze and respond to information from all the channels their customers use. Cases like Nordstrom emphasize the need for sensitivity to customers’ concerns about privacy and how that information is gathered. But if a retailer is successfully engaging with the consumer through digital and social media and in the store, a lot of that knowledge is already available.

“Collecting and understanding the information that people are freely giving about themselves on social platforms, as well as data from their interactions with apps, online and through mobile devices, is crucial,” Macy said. “Some companies are now using systems that collect this type of data and are turning it into intelligence that helps retailers.”


Information on digital and social media flows both ways, and retailers are realizing that 5i consumers expect a collaborative relationship with their brands. “Our Facebook page has helped us evolve our marketing from merely broadcasting to customers to a two-way dialogue with them,” said Lou Jones, head of online and digital marketing at British multi-national retailer Marks & Spencer.

Global fashion brand Benetton, meanwhile, is using 3D modeling, information intelligence and social and collaborative apps to capture consumer involvement alongside input from factories, suppliers and designers – all groups that include 5i consumers. Extending collaboration in this way has enabled the company to develop new collections more quickly, saving time and money in responding to new trends and customer demands.

“Organizations must align their entire value chains to develop collaborative partnerships with consumers,” Ernst & Young said in its 2012 report, “This Time It’s Personal: From Consumer to Co-Creator.” According to the report: “Customer collaboration is particularly essential in innovation.”

Rising to the challenges of digital and social media will be an ongoing learning process as new channels continue to emerge. But forward-thinking retailers like Tesco, Nordstrom and Benetton are gaining the understanding they need to reach the 5i consumer.

“It’s all about being more responsive and communicative,” Hand said. “How you extend your presence and create more personalized, virtual marketing that engages the consumer – that’s what will keep you on top of the shopper’s mind.”

Bricks & mortar

Creating stores that consumers want to visit

Jacqui Griffiths

4 min read

As the e-commerce experience continues to improve, it is tougher than ever to attract shoppers to physical stores – and those who do visit the store often buy online. To make their physical sites work, retailers are struggling to redefine their stores as appealing destinations not only for test-driving a product, but also for buying it.

Mobile technology holds challenges and opportunities for retailers. Professional services company Deloitte found that 75% of consumers research products both online and in-store before purchasing, while the annual “Mobile Life” study conducted by TNS, a London-based global market research company, suggested that one third of consumers practice ‘showrooming’ – browsing in stores before purchasing from a competitor online.

Showrooming can be fatal to traditional bricks-and-mortar stores. The way to combat it is to make in-store engagement with shoppers so rich that the customer will want to purchase the product there and then.

“Providing the ‘In-Store Experience’ is vital for physical stores to win over foot traffic and brand loyalty,” Shelley E. Kohan, vice president of retail consulting at in-store analytics company RetailNext (California, USA), wrote in a recent column for Chain Store Age. “Shoppers want to explore, learn and have fun. Above all else, they want that instant gratification from products that are in stock and easy to find.”

Retailers are taking Kohan’s message to heart. For example, UK fashion retailer Burberry provides an immersive brand experience at its London flagship store, where a gallery featuring the brand’s history sits alongside digital screens, speakers, a hydraulic stage and iPads. “It’s not just about shopping,” Christopher Bailey, Burberry’s chief creative officer told The Business of Fashion website. “The important thing is that when you go in, you feel entertained.”


The analytics capabilities inherent in online retail are now becoming a reality for the physical store as well. QR codes, RFID tags, shopping apps and video clips delivered in-store can provide vital information about how consumers interact with the store so retailers can make intelligent use of the space they have.

The growth in bricks-and-mortar stores’ use of such technologies is consistent with analyst firm IDC Retail Insights’ Top 10 predictions for 2013, which forecast that retailers will “pivot their merchandising and marketing on customer analytics” by investing in “customer analytics, merchandising and marketing technologies” as well as “technologies that enable visibility, visualization and virtualization.” Using these sophisticated platforms, retailers can pull together information such as in-store analytics, consumer group feedback and brand rules, and share it directly with planning teams to enable informed decisions about store layouts and product assortments.

“The goal is to have in-store shopping metrics that mirror online capabilities: the who-what-when-where-how of customers,” Kohan said. “It is not enough to have the data; retailers must use the data in ways to grow shareholder value. Applying big-data metrics in the physical store results in targeted, personalized marketing and predictive in-store shopping behavior.”


The store itself isn’t everything; customer service is a vital part of the in-store shopping experience that online retailers simply can’t match. IDC Retail Insights found that as many as 60% of consumers were more likely to buy in-store when assisted by trustworthy, knowledgeable store associates – but they want mobile content too. Some 64% said that what they can learn using their smartphone in-store would influence their decisions as much as what they learned during in-depth online research at home.

“Rather than seeing mobile as a threat to in-store sales, retailers must embrace it as the most immediate and personalized way to engage shoppers to ensure they don’t leave empty-handed,” said Matthew Froggatt, chief development officer at TNS.

In April 2013, UK-based Marks & Spencer opened its new concept store in Holland featuring digital innovations. Within the store, customers can buy food but also explore the full range of clothing products on tabletops and digital screens. (Photo courtesy of Marks & Spencer)

That personalized service can begin even before a shopper enters the store, subject to their privacy preferences. “Some stores are backing up geo-location technology with analytics and apps so that when a customer is nearby they receive a message that’s relevant to them,” said Greg Girard, program director of Merchandise Strategies at IDC Retail Insights. “If a customer abandoned a dress in her online shopping cart and is now outside the store, she’s more likely to have questions about the dress than concerns about the price. So you can pop up a message to say someone will be there to greet her and answer those questions. In the process, you shift away from discounting and provide the service she really wants.”

In the store, QR codes and RFID tags can provide relevant, interactive information about products through shoppers’ smartphones or through the store’s mirrors and digital signage. Fashion retailers Burberry, Guess and Benetton are going a step further, providing tablet devices to give shop-floor staff an overview of inventory and customer preferences so they can provide a more intuitive, personalized service.

Shopping apps also are proving popular. For example, US-based discounter Walmart offers an app that helps consumers plan their visit and connect to their online and physical store experiences. “We can greet each of our customers by name, guide them through our stores, and give them product recommendations and real-time savings — all from their mobile devices,” said Gibu Thomas, Walmart’s senior vice president of mobile and digital.


This type of connectivity – supported by network-wide visibility of inventory and fast delivery options for out-of-stock items – is helping some retailers turn showrooming to their advantage.

“It’s not just about shopping. The important thing is that when you go in, you feel entertained.”

Christopher Bailey
Chief Creative Officer, Burberry

For example, UK electronics retailer Dixons has retrained its staff and removed the personal sales targets that pitted its stores against the company’s own online offer. It now provides store-wide incentive programs linked to customer satisfaction, whether the sale is transacted online or in-store. “We’ve dramatically improved our online search and we’ll be crediting stores with online sales in their catchment area,” Dixons CEO Sebastian James told Retail Week. “It’s good practice, but surprisingly few retailers are doing it.”

“The key is to find ways to make buying in-store the convenient option,” TNS’s Froggatt sums up. “Anything that saves the consumer time, money or angst will help reduce the flow of people out of the shop to purchase elsewhere.”

While e-commerce brings challenges to traditional retailers, it also presents an opportunity to harness information and integrate customers’ technology preferences with the physical space of the store. Retailers who take that opportunity are delivering the anytime, anywhere, any way stores that today’s consumers want.

The future of construction

Balfour Beatty executive predicts bright promise for 3D technology

Nick Lerner

3 min read

Balfour Beatty, which employs about 50,000 people worldwide, is recognized as a world-class infrastructure lifecycle services business. Compass spoke with Dr. Chris Millard, business efficiency director at Balfour Beatty’s UK Construction Division, about integrated project delivery and collaborative construction.

COMPASS: Can you give us a brief picture of Balfour Beatty and your position within it?

CHRIS MILLARD: Innovation, sustainability and delivery capability are key themes that I address at Balfour Beatty. I work across the company with teams drawn from all areas of expertise within and beyond our enterprise. We work on transport, power, social infrastructure for education, health and sport, utilities, accommodation and leisure, and retail assets. I introduce management and leadership principles and practices that improve efficiency and cut project waste by up to 60%.

Can you outline the construction industry’s major challenges?

CM: A flat economy and reductions in infrastructure spending mean that we must deliver more value to stakeholders. We can achieve this when a focus on sustainability also delivers and maintains recognizable value for customers. Cutting waste has financial and ecological benefits. Consistently driving business efficiency with integrated, systematic improvements is a best-practice prerequisite. It is important to break down information silos and integrate teams to successfully and economically deliver excellent projects.

How do these improvements affect stakeholders in Public Private Partnerships (PPP) and Private Finance Initiatives (PFI)?

CM: One of the key requirements of PPP and PFI involvement is to optimize and demonstrate value across project lifecycles. Constructively engaging stakeholders with 3D simulation and visualization technology early in the project, so that everyone sees and agrees on the end result, allows us to more accurately understand and innovatively meet their needs. Modeling and simulating projects and budgets through their lifecycles provides a basis for this transparency. Virtual Reality also boosts stakeholder confidence by empowering us to establish and enhance the performance of buildings while minimizing operation and maintenance costs.

Dr. Chris Millard, business efficiency director at Balfour Beatty’s UK Construction Division

How do integrated project delivery and collaborative construction benefit the construction industry, when compared to traditional processes?

CM: Traditionally, the construction industry has been fragmented, with sequential processes: brief, estimate, tender, re-design, re-tender, create work packages, build, etc. Many companies still operate this way, and it is clearly wasteful. We gain real value by integrating project delivery and supply chain partners early in the design phase. Multi-disciplined teams jointly plan and iterate to define better solutions.


Balfour Beatty’s collaborative construction practices cut project waste by up to 60%, carbon dioxide by 42%, and direct water use by 54% (2010-12).

Can you cite an example?

CM: The 2012 Olympics’ Aquatic Centre in London brought together multifunctional teams of structural, mechanical, architectural, civil services and ground engineers on a highly innovative project with many technical and operational challenges.

The complexities included sinking pilings nine meters below the water table on a constrained site and the engineering and construction of the high, unsupported, dual-curvature roof. The beauty, functionality and legacy of the Aquatic Centre are testaments to the benefits of collaborative construction within an integrated project delivery framework.

How much integrated project delivery and collaborative construction happens in the industry today?

CM: ‘Not enough,’ is the blunt answer. However, informed sectors of the industry recognize and capitalize on process improvement. When the economy picks up, those with appropriate processes will be prepared, ready and able to maximize their advantage, while others become unsustainable.

What must companies do to achieve these benefits?

CM: Government strategies in some countries offer support to companies that want to adopt integrated project delivery programs. Resources are available from leading industry players and potential partners. Business transformation partners are popularizing collaborative ways of working and helping to liberate value through technology. A critical mass of understanding and commitment among senior business leaders throughout the industry has also been reached.

How do integrated project delivery and collaborative project handling change people’s experiences?

CM: These approaches help people experience transformative levels of understanding. The ability to fully comprehend stakeholder requirements leads to more innovative, timely and cost-effective solutions. It is important on a project to give stakeholders universal access to 3D data in meaningful ways. 3D simulation changes perceptions and lets people make better and more fully informed decisions.

How do owners’ lifecycle cost considerations fit with this?

CM: Engaging owners around fully featured 3D models and realistically virtualizing future assets encourages rich, early dialogue. In complex facilities, 3D models enable owners to plan for maintenance without shutdowns. Owners place a high value on budget transparency and easy access to information. With 3D, these advantages are built into projects from their inception.

What is your vision for the construction industry?

CM: Working to, and executing, a plan based on a unified and collaborative experience platform offers huge benefits. We need sustainable, adaptable, reconfigurable buildings and infrastructure assets. 3D, digital modeling, process change, and using technology for integration and collaboration all help to achieve this goal and positively contribute to society’s well-being. I would like to see the construction industry accelerate its uptake.

Breaking with tradition

Why the construction industry needs an industrial revolution

Nick Lerner

4 min read

Until recently, the construction industry has suffered a technology bypass, relying on centuries-old processes and procedures to manage complex modern projects. Today, however, the same software applications that make manufacturing industries efficient are being deployed in building construction. Compass spoke with leading construction industry consultant Dr. Perry Daneshgari about why the industry must evolve.

In 50 years of the most accelerated technological advances, a period in which industry after industry has used technology to improve efficiency, the art of building has lagged. Studies by the National Institute of Standards and Technology (NIST), as well as Tulacz and Armistead, have documented 25% to 50% waste in coordinating labor and in managing, moving and installing materials. In many cases talent and skill are underused, avoidable accidents happen and productivity remains low.

With high projections for growth – PricewaterhouseCoopers (PwC) predicts that by the end of this decade, construction will account for more than 13% of the global economy – the time is ripe for change.

Dr. Perry Daneshgari, a construction industry consultant famous for championing the adoption of process and product control, believes that the construction industry needs to look to economist Adam Smith for inspiration.

“Adam Smith’s ideas of segregating tasks and the division of labor have been implemented in manufacturing over the past 250 years to produce an abundance of goods at affordable prices,” Daneshgari said. “Productivity has been accelerated through the application of statistical process control (SPC) that helps to generate supply and to satisfy demand.”


Process models for construction have remained largely the same for hundreds of years, with highly skilled labor carrying out tasks for which they are overqualified 80% of the time. “Making components in a factory enables manufacturing by lower-skilled operators,” Daneshgari said. “This cuts cost, improves quality, reduces onsite re-work and allows total operational control. In this system, work onsite consists of assembly of quality-assured parts, each guaranteed to be fit for purpose.”

Applying SPC to the automotive industry allowed cars to become better and less expensive. The lack of the same methodology in construction has contributed to waste and to soaring prices, Daneshgari said, but he believes that dynamic is about to change.

“New types of companies are looking at construction as a huge opportunity. For example, an air-conditioning contractor has branched out and started to industrially produce and erect large buildings in just six days. In China, the world’s tallest building, Sky City, will be built in 120 days by adding five stories per day. We are also seeing contractors join into larger groups. They are changing building from a cyclical, low-tech, physically exhausting and unsafe industry to one that is attracting new, innovative talent.”

On the benefits of industrialization: “It has been proven that you can build at twice the speed using half the resources,” Daneshgari said. “Technologies are slow to reach construction, but when they do the impact and rewards are significant and worthwhile.”


Stacking problems occur when small errors in each building component multiply over multiple floors. This leads to electrical, power and other services, such as heating and ventilating, no longer fitting the structure. “That usually requires the deployment of highly skilled workers on site, remodeling concrete with rock drills or making expensive and potentially problematic changes to mechanical services at the point of fitting,” Daneshgari said. “Statistical variation techniques, which have been standard practice in the auto industry for 60 years, would solve the problem.

“Unfortunately, the construction industry’s investment in research and development is among the lowest of any major industry,” Daneshgari said. “But when you start to innovate with technology to drive the use of standardized products and modularized processes, productivity gains are spectacular. 3D has made significant inroads into architectural design and fabrication, but process modeling is virtually non-existent.”

To increase efficiency, eliminate waste, and increase profit margins, companies in the construction industry, as well as governments, must invest in R&D, Daneshgari said. “Those that have invested achieve cost reductions and quality improvements that let their companies win contract after contract,” Daneshgari said. “New types of companies are building shopping malls at 40% lower cost. Zero-error buildings are being made where the reduction of re-work is producing bigger profits for those involved.”


Attracting outside talent to newly structured construction enterprises gives them a much needed and valuable intellectual boost. “The construction industry is being changed by a small number of very clever and enterprising people who are transforming the industry by taking new approaches,” he said. “Unshackled by tradition and equipped with portable, powerful and robust technology, they are bringing the business into the 21st century.”

Construction industry players who don’t step up to the challenge of modernizing their processes will hold the entire industry back, Daneshgari believes. He cites W. Edwards Deming, who made significant contributions to industrial science and practice. Deming proved that individual performance could only be improved by “elevating the entire system,” Daneshgari said. “Using ideas from Newton, Adam Smith and, in the 20th century, Deming, industry in general has been able to progress to its current advanced state. Generally, the construction industry did not follow the same enlightened path. Unless prompt action is taken, it will be surprised by new companies that efficiently capitalize on the massive economic opportunities in this rapidly growing industry sector.”

Dr. Perry Daneshgari, a widely published consultant to construction and other industries, holds a Ph.D. in Mechanical Engineering from University of Karlsruhe, Germany, and an MBA from Wayne State University (USA). Through his company, MCA Inc., he advises businesses around the world on process and product development, waste reduction, labor productivity improvement, project management, estimation, accounting and customer care. (Photo by David Lamarand)



Best of the best

World’s top schools individualize learning

Christian Füller

6 min read

Finnish, Korean and Canadian schools consistently lead international comparisons of top-performing educational systems. The common denominator? High regard for the teaching profession and individualized learning tailored to each student.

It’s just after 8 a.m. at the Olari High School in Espoo near Helsinki, Finland. The pop music of Robin, the Finnish version of Justin Bieber, rings out over the public address system. Everyone in the school starts tapping their feet. Have the students taken over the school?

“No,” headmistress Kaisa Tikka says, then smiles. “We want to start school each morning with something that is fun. This is why the students determine which music will start the day and the learning.”


The Olari High School is not an alternative school, but it is one of the best schools in Finland. For example, Olari’s students have repeatedly won the Scandinavian Mathematical Olympiad. Despite its track record of excellence, however, Olari is an ordinary Finnish school. What makes it special among schools globally is its focus on putting students at the center of everything.

When comparing schools in Finland, Canada and Korea, countries that occupy the top places in the global school comparison study “Program for International Students Assessment” (PISA) year after year, a pattern of putting the student at the center is the first obvious commonality. The way each system places students at the center of the curriculum varies based on culture and continent:

• Tiny Finland has rebuilt its schools over the past 40 years into a model of academic success, a transformation known as the “school miracle.”
• In Korea, the Confucian culture of learning leads with discipline and rigor that also allows for significant levels of individual, personalized learning.
• Canada, an industrialized Western country, permits each of its widely varied provinces to adopt its own school model; Canada doesn’t even have a national ministry of education.


Andreas Schleicher, better known as “Mister PISA” because he invented the worldwide school comparison, sees another main similarity among top-performing schools: their teachers.

“Finland, Korea and Canada have made teaching an attractive career choice that invites the best candidates,” said Schleicher, who leads the Education Department of the Organization for Economic Cooperation and Development (OECD), a global alliance supported by 40 nations. “They support their teachers to make innovations in the art of teaching, to improve their own performance and that of their colleagues, and to pursue professional development that leads to stronger educational practice. Their teachers work with professional autonomy, but in a collaborative culture.”

“Finland, Korea and Canada have made teaching an attractive career choice that invites the best candidates.”

Andreas Schleicher
Head of the Education Department of the Organization for Economic Cooperation and Development (OECD)

Schleicher surveyed teachers worldwide with a focus on how they are trained. The result: Finland, Korea and Canada are all nations where teachers enjoy a high reputation in society. In Finland, for example, only the best high school graduates are admitted to teacher education – and the best students want to be teachers because the profession is so attractive.

“Finland’s elite teach,” said Jukka Sarjala, former head of Finland’s Central Education Office and one of the architects of the Finnish “school miracle.” But the actual secret to a good school, Sarjala said, is not selectivity, but integration. “All children aged 7-16 go into the same school. In each municipality, ranging from the rich south to the poor north, there is only this type of school.” The pattern is the same in Korea and Canada, which also are characterized by non-selective school systems.

“It is the different style, the different education, which makes the integrative school,” Sarjala said. “The teacher is responsible for 25 different children, and must develop 25 different educational concepts.”


The result cannot be read in studies, but can be observed first-hand in the classroom. Consider the teaching of English at Olari High School. At first glance, the approach to teaching appears no different from the traditional lecture method. At the front of the room, the teacher has written “Charlie Chaplin” on the schedule. But this teacher doesn’t lecture. Instead, his eighth-grade students analyze the master movie director for themselves, using English-language resources and presenting their findings in English.

In Finland, “the teacher is responsible for 25 different children, and must develop 25 different educational concepts.”

Jukka Sarjala
Former Head of Finland’s Central Education Office

Each small group chooses from a list of topics: “The Great Dictator” (one of Chaplin’s most famous films); Chaplin’s life; Chaplin’s contributions to film. Then, each group of students researches its topic online. Later, the students present their findings to the class while the teacher films the group. Afterward, the teacher discusses the presentation with the students, giving feedback on their use of English. The advantage of this “no more lecturing” style is that students are able to explore Chaplin for themselves. This approach ensures that every student is involved, using full sentences to express their ideas on a topic of genuine interest.


The Korean education system is stereotyped as being purely focused on repetitive drills. A closer look, however, reveals that in Korea, as in Finland, individual tasks allow students to learn independently.

“In the past, students could prepare for university entrance exams with ‘passive’ learning characterized by lecture and memorization,” explained Inn-Woo Park, a professor of pedagogy (the art and science of education) at Korea University. “However, things have changed and passive learning is no longer a good way to prepare for the exams.”

Years ago, students in Korean classes sat side by side or in neat rows, listening quietly to their teachers. Today, the situation is far different. Each class begins with students receiving an introduction to the lesson from a teacher at the front of the classroom. Then, however, they split into working groups, studying independently with computers.

“In the past, a strong student was one who was able to remember what he or she had learned,” Korean elementary teacher Cha-Mi Kwon says in an OECD documentary on Korea’s educational transformation. “Nowadays, they need to be able to select what is useful from various sources of information and re-create it as their own.”


Video studies from Canadian schools, meanwhile, show a similar picture. Teachers rarely stand in front of formal learning groups and speak. Instead, Canadian schools prefer group and independent work tailored to the needs of individual students, either through tutoring or special training. If one group gets ahead of the others, it can continue to progress at its own pace.

At Unionville High School in Markham, Canada, each student is assigned a Student Success Teacher, provided by the Province of Ontario. “I am thrilled by what the term stands for, which is supporting every student and ensuring that each one has multiple opportunities to be successful,” said Susan Logue, the school’s principal.

Ontario is one of Canada’s most successful provinces in the field of individualized learning, in part because one-quarter of Ontario’s students were born outside of Canada. “We need to deal with that diversity,” said Mary Jean Gallagher, assistant deputy minister of education for Ontario. “One of the ways we have done that is by ensuring that we provide more support and more resources in meaningful ways to schools that are more challenged.”


The educational systems of Finland, Korea and Canada also are in constant change, embracing new technology and student individuality to maximize success. In all three countries, leveraging computers and the Internet in digital learning environments enables three distinct advantages: helping students learn traditional subjects in non-traditional ways; preparing students to use key technologies of the 21st century; and facilitating individualized learning.

As with individualized learning, the three countries’ approaches to digital learning are as different as their cultures.

Korea’s national strategy of digitization began in 2007, when the country revised its technology-development plan with the goal of moving every Korean student to all-digital textbooks. Starting in 2015, each Korean student will receive a tablet computer at no cost.

In contrast, decentralized Canada has no national plan for digitized learning. Each province adopts its own approach.

In Finland, meanwhile, each school has autonomy. For example, following a history field trip, a student might summarize the experience by assembling an e-book on a tablet from texts, films, and photos available online. An entire class at Olari School received a set of tablets for their “time travel” excursion to a period long before recorded history: the Ice Age. With the devices, the students documented excavations, interviewed experts, wrote their texts and built their graphics. “We started with iPads because we wanted to have something very new,” Olari School headmistress Tikka said. “The learning process is more in groups, not individual. With iPads, students are sharing more things.”

In Espoo, no national support is needed because the headmistress has a budget she can spend at her discretion. “I knew the benefits of tablets and bought a set of equipment for my school,” Tikka explained. “A project group of teachers tested them.” As of 2013, every student at the Olari school receives a tablet in the 11th grade.


So what can the rest of the world learn from the examples of Finland, Korea and Canada? Perhaps the best lesson is what not to do.

Pasi Sahlberg of Finland, who teaches at Harvard University, is one of the world’s most sought-after educational advisors and school reformers. He observes: “There are interesting factors that you can’t find in any of these high-performing systems – government-driven privatization of schools, reliance on standardized testing, putting schools in competition against one another, or confrontation between authorities and teachers.” ◆

  • Christian Füller is a German journalist and author who has written about failing school systems and teaching for the 21st century. He has studied international school systems, including those in the Netherlands, the Scandinavian countries, Poland, Vietnam, and the USA.

Gender gap

Tired of waiting for firms to promote women, many governments consider quotas

Rebecca Lambert

5 min read

Developing and advancing talented women promises significant economic benefits. Studies show that companies with a higher proportion of women on their management committees are the best performers. But top circles of corporate and government leadership remain male bastions, and governments across Europe are adopting legislation to force equality.

Women account for half of the potential global talent base but, according to the World Economic Forum’s “Global Gender Gap Report 2012,” significant gaps in senior positions, wages and leadership levels still persist in corporations and governments in all countries.

Those gaps remain despite evidence that indicates organizations limit their own success when they fail to take full advantage of their female employees’ knowledge and skills. “Research shows that practices which aim to close gender gaps have a positive influence on bottom line, innovation and team success,” said Saadia Zahidi, an author of the report who serves as senior director of Gender Parity and Human Capital at the World Economic Forum (WEF). “Pursuing gender diversity also leads to access to a growing, well-educated segment of the workforce.”

“Women Matter,” an annual research project started in 2007 by global management consulting firm McKinsey, supports Zahidi’s observation, providing evidence that companies with the highest proportion of women in senior management positions achieve enhanced organizational and financial performance. One study showed that the 89 publicly traded European companies with the highest levels of gender diversity in top management posts, on average, outperformed their sectors in return on equity (11.4% versus an average 10.3%), operating results (earnings before interest and taxes of 11.1% versus 5.8%) and stock price growth (64% versus 47%) between 2005 and 2007.

“Today, women account for more than half of university graduates globally; in the Organisation for Economic Cooperation and Development member countries, they will soon account for around 70% of those coming out of higher education,” said Pedro Taborda, deputy director of Caixa Geral de Depósitos in Portugal, who has previously served as a human resources manager for numerous Portuguese companies. “To succeed, companies must take advantage of the talents of the entire workforce.”


For Taborda, the gender gap is particularly perplexing because women control most consumer spending. “In Europe, women are the driving force behind more than 70% of household purchases, even though they account for only 51% of the population,” he said. “Having companies with leaders who better match their customers’ demographics makes good business sense.”

However, companies and governments are a long way from equally leveraging the talents of their female employees. In its 2013 annual survey “Women on Boards,” global research and ratings provider GMI Ratings found that women hold just 11% of board seats at the world’s largest companies. This marks an increase of 0.5 percentage points since December 2011 and only 1.7 percentage points since 2009.

The survey also highlighted the significant variations by region. While Norway, Sweden and Finland continue to lead the developed world in their percentage of female directors, with 36.1%, 27%, and 26.8%, respectively, Canada’s progress is stagnant at 13.1%, unchanged from 2012. Representation of women on boards remains poorest in Japan, where 1.1% of directors are female.   

“Each company has to identify practices that will suit best their gender gaps, their company culture and national policies.”

Saadia Zahidi
Senior director of Gender Parity and Human Capital at the World Economic Forum

According to the 2012 McKinsey report “Women Matter: An Asian Perspective,” women on average account for just 6% of seats on corporate boards and 8% of those on executive committees across Asia. The UN Economic and Social Commission for Asia and the Pacific reported in April 2013 that restricting job opportunities for women is costing the region an estimated US$89 billion every year.

“The blunt truth is that men still run the world,” Facebook COO Sheryl Sandberg wrote in her book Lean In. “While women continue to outpace men in educational achievement, we have ceased making real progress at the top of any industry. This means that when it comes to making the decisions that most affect our world, women’s voices are not equally heard.”


Legislators in several countries have grown tired of waiting for corporations to become more gender-balanced – despite marked gender inequality in their own ranks. Belgium, France, Iceland, Israel, Italy, Malaysia, the Netherlands, Norway and Spain have all introduced legal instruments to promote gender equality on company boards. The European Commission, where 54.5% of staff members are women, has also proposed a law that would require company boards to have 40% female membership by 2020.

Not everyone thinks such quotas are a good idea. “On the surface, a quota may simulate fairness, but it tends to exacerbate unfairness for individuals rather than eliminate it,” German Family Minister Kristina Schröder, who resigned to spend more time with her children following the 2013 elections, told Spiegel Online in March 2012. “I think it’s absurd to impose a uniform quota on very different companies, from the steel industry to the media.” Women with family responsibilities also tend to reject the 80-hour workweeks expected of senior executives, Schröder said, making them more difficult to promote.

Astrid Lulling, a member of European Parliament (MEP) from Luxembourg, agrees. “Why don’t we have women governing the Central Bank in many countries? Because women, who are now at the age to enter into these positions, didn’t study mathematics, finance, business administration or economics,” she told German broadcaster Deutsche Welle.

Proponents are not dissuaded. “I don’t like quotas, but I like what they do,” said Viviane Reding, the European commissioner responsible for the EU proposal. Likewise, Easyjet CEO Carolyn McCall told Management Today magazine: “I’m against quotas because of their tremendous potential to cause a backlash and set things back further, but there has to be a better balance on boards. It’s the pipeline of female executive talent that we should focus on.”


BAE Systems and Lockheed Martin – which operate in the traditionally male-dominated aerospace and defense industries – are just two examples of top companies that have appointed female chief executives to lead their businesses.

“We’ve been investing in talent and developing our talent for many years,” Marillyn Hewson, who took the top position at Lockheed Martin at the start of 2013, said at the company’s Ninth Annual Women’s Leadership Forum in November 2012. “It’s no surprise to me that we’re seeing more and more women leaders moving into senior roles. They have competed, and they have gotten results. I’m really proud of Lockheed Martin and what we’ve done to build our talent pipeline. We have invested in the growth of our employees. We’ve worked hard to help them get ready for the next job.”


Women currently hold 4% of Fortune 500 CEO roles. (Fortune Magazine)

To groom female employees for top positions, however, companies first must develop female-friendly policies and cultures. “Family leave, childcare assistance and gender-friendly taxation systems are all policies that can play a key role in influencing the magnitude and scope of the economic participation gender gap,” Zahidi said. “Additionally, companies can put in place practices such as measurement and target-setting, mentorship and training programs, awareness, incentives and accountability, and work-life balance, among others, as these can have an impact in closing the economic gender gap.”

Such programs are a strong focus at BAE, said Donna Halkyard, head of Diversity and Inclusion at global defense, aerospace and security company BAE Systems Inc., a US-based, wholly owned subsidiary of London-based BAE Systems plc. “We run school roadshows to encourage young girls to pursue science, technology, engineering and mathematics (STEM) related careers. At the moment, the (UK) national average for female STEM apprentices is 5%; we take in more than double that.”

Halkyard notes that Linda Hudson, president and CEO of BAE Systems Inc., sits on the company’s global executive committee. “Linda has demonstrated a real commitment to meeting our gender-balance goals and inspiring women to succeed,” Halkyard said. “She has personally established a diversity council composed of senior leaders within our US business to actively engage with women at all levels of the business and help them reach their full potential.”

BAE Systems also has established employee networks, including a global women’s virtual forum, where women across all levels of the business can share their personal experiences, mentor one another and highlight their success. “Currently, 25% of our board of directors is made up of females,” Halkyard said. “Our chairman has made a public commitment to at least maintain that level. We also have a company target to ensure that at least 25% of our people at executive committee level are women. In June 2012, we achieved that.”

Like BAE Systems, companies benefit most when they implement gender diversity in ways that complement and enhance their specific corporate cultures. “Ultimately,” Zahidi said, “each company has to identify practices that will suit best their gender gaps, their company culture and national policies.” ◆

Culture in question

Protecting diversity in a globalized world requires balance

Jacqui Griffiths

6 min read

Around the world, communities are working to preserve their local culture against the threat of homogenization. However, globalization and digital communication – the same forces that are driving standardization – could also help to create an evolving, multicultural experience.  

Stories, place names, art, television, literature and music are just a few of the important elements that contribute to a community’s history and to its present and future cultural identity. But those elements are not set in stone. Threats to local culture – from the vanishing Gaelic languages of Scotland and Ireland to the global chain stores that have made the Champs Élysées in Paris, in the words of New York Times journalist Steve Erlanger, “far less French” – have prompted outcry among local communities.

For many, it is increasingly important to protect local culture from the homogenizing effects of fast-moving, digitally enabled globalization. But it also is important, experts say, to acknowledge the evolutionary nature of culture within this increasingly globalized world.

“There is always room to do something to protect culture,” said Cecile Duvelle, chief of the Intangible Heritage Section at the United Nations Educational, Scientific and Cultural Organization (UNESCO). “What is impossible is freezing a culture to keep it intact from any evolution or change. Cultures are living bodies that have changed drastically over the years, so we shouldn’t be afraid of globalization. While it is having an impact on all cultures, it is also giving all sorts of opportunities for all cultures to evolve and produce new expressions.”

Those new cultural expressions can come from a wealth of sources, not least the digital media that allow global communication, interaction with other cultures and, in the case of films, television programs or music, global distribution. Still, resistance to such evolution is a long-standing, often generational phenomenon.

“We have a tendency, especially as we get older, to regret that some cultural expressions are disappearing and that the new cultures are more homogenized, but this is often because we are not ready to recognize the new forms of culture,” Duvelle said. “While it can’t be denied that there is a disappearance of culture and cultural diversity, there is also a wealth of new cultural expressions created by this.”

Speaking the language

Language and dialect are key focal points in many attempts to preserve cultural heritage. “When we talk about preserving dialects, we are really talking about galvanizing communities themselves,” said Dr. William Lamb, lecturer in Scottish Ethnology at the University of Edinburgh in Scotland. “Local languages and dialects are a crucial part of our collective linguistic and cultural history.”

Efforts such as subsidized bilingual business signs, local-language TV channels and education can help to raise the perceived status of a language. Indeed, these measures are meeting with some success for the Gaelic language in Scotland. But can languages and dialects realistically remain unchanged as the world of communication evolves? “Indirectly, through increasing awareness and education about the language, people may use it more and ensure its continuation in an inter-generational sense,” Lamb said. “But ultimately, the way that we speak and write changes incrementally over time, and we know that it is very difficult to engineer control over linguistic behavior.”

“Local languages and dialects are a crucial part of our collective linguistic and cultural history.”

Dr. William Lamb
Lecturer in Scottish Ethnology at the University of Edinburgh in Scotland

Lamb believes that a new dialect of blended pronunciations is likely to emerge in Scotland as children hear a variety of Gaelic dialects from their teachers, and that while the loss of local dialects is to be lamented, this evolution of language is better than losing it entirely. “As in many things, there is both ‘good’ and ‘bad’ when it comes to linguistic homogenization,” Lamb said. “If children are proud of the language and speak it to one another and, subsequently, to their own children, then there is a future for the language. That is the yardstick by which we need to evaluate any interventions and observed phenomena to do with Gaelic, or any other minority language.”

Some countries, however, fiercely defend the purity of their native tongue. Consider France, where the French government recently rejected the social media term “hashtag” in favor of “mot-dièse.”

Still, linguist Nicholas Ostler, chairman of the Foundation for Endangered Languages, suggests that the Internet may actually help to slow the adoption of English as the global lingua franca, the default language used to communicate between people who don’t share a native tongue. Why? Because online translation tools are becoming faster, more powerful, more accurate and, increasingly, they’re free. “The main story of growth in the Internet is of linguistic diversity, not concentration,” Ostler wrote in his book The Last Lingua Franca: English Until the Return of Babel. Perfect instantaneous translation is some way off. Still, Ostler’s view of the linguistic future points to the Internet as an enabler of diversity rather than a homogenizing influence.


Perhaps it should come as no surprise then that efforts to preserve local culture are designed to work within a global context. Many governments, from Australia to Venezuela, have put quotas in place to ensure that a percentage of the material broadcast on radio and television is in the native language, or represents the local culture.

In Canada, for example, a percentage of radio and television broadcasts, including cable and satellite channels, must include content that is at least in part written, produced or presented by Canadians. According to the Canadian Radio-Television and Telecommunications Commission, both French-language and English-language radio stations also must ensure that at least 35% of the popular music broadcast each week is Canadian content.

Such measures are not aimed at denying globalization, but are a way of ensuring a place for local cultures in a globalized world, said Charles Vallerand, executive director of Canada’s Coalition for Cultural Diversity. “It’s not so much a way to close borders or reject foreign content,” Vallerand said. “It’s more a question of protecting your own cultural assets. That’s what Canada and France have been doing with quota systems or with subsidy measures with public broadcasting. We have a lot of foreign content on our broadcast media, so the issue becomes one of creating shelf space for your own content.”

In 2005, UNESCO adopted the Convention on the Protection and Promotion of the Diversity of Cultural Expressions, which supports efforts to protect local cultural expressions and disseminate them on a global scale. A glance at the reports submitted in 2012 in support of the convention shows that a global perspective has become key to many cultural policies.

“Protecting local culture and ensuring diversity is a way of accommodating globalization, not resisting it.”

Cecile Duvelle
Chief of the Intangible Heritage Section at the United Nations Educational, Scientific and Cultural Organization (UNESCO)

A policy in Brazil, for example, helps the country’s film producers learn to work with international industry standards so they can build partnerships and access financing on an international level. In France, meanwhile, a policy to encourage the promotion and maintenance of cultural diversity in the book sector – including both digital and physical formats – has become a model for similar policies across Europe and in Latin America.

“There is a trend towards keeping very rooted, local traditions alive with a sense of being part of the global village,” Duvelle said. “Even in the developed countries, there is a sense of keeping in touch with traditional music, artifacts and customs. And it’s clear that this is not just for older people; it’s for the young people because they’re very proud of their culture. They want it to be recognized and they’re proud to be part of this practice as citizens of the world. The United States is a great example of a nation which has a lot of diversity, and it recognizes the diversity within its territory while having a sense of national identity. It keeps a very strong sense of cultural identity even with all the different languages spoken.”


Notably, the UNESCO convention focuses not only on the protection of cultural diversity, but also on its promotion in any medium. In the endeavors of various communities, the notion of diversity is built on the acknowledgement of difference and exchange. Both of these elements are fostered on a global scale by digital technologies, and this is changing the trade environment of cultural output.

In trade negotiations, the focus has now shifted to electronic commerce, especially the electronic transfer of broadcast services. “Different parties might agree to cultural exemption, quotas, subsidies and so on, but when they negotiate on electronic commerce they will insist that everything needs to be liberalized,” Vallerand said. “Recognizing this new form of trade, which is the future of anything to do with broadcasting, as a product and not as a service will lead to further trade liberalization.”

The full effects of this increasingly globalized, digital world on cultural exchange are yet to be seen, but there is a clear emphasis on the need to embrace diversity, with little room for an isolationist approach to cultural identity.

“Globalization has given us access – through new communication channels or through travel – to cultures that we couldn’t get close to some 20 or 30 years ago,” Duvelle said. “Cultural diversity will not take care of itself, and if we don’t take care of it we may be surprised to find one day that some things have disappeared. But culture and globalization are not mutually exclusive, though there is sometimes a tendency to think they are. Protecting local culture and ensuring diversity is a way of accommodating globalization, not resisting it.” ◆

Cloud computing

Renting hardware and software comes of age

Dan Headrick

5 min read

The cloud, like many new technologies transforming business and society, is gaining significant traction. With cloud’s clear advantages in terms of costs and flexibility, the biggest challenge most companies face now is deciding how to integrate the cloud’s varied services with existing operations.

For the vigilant CEO lying awake at night, grappling with the challenges of ever-shifting, rapidly accelerating business cycles, cloud computing offers a useful advantage: the flexibility to learn, adapt and even fail quickly so the company can minimize its losses and move on to the next opportunity.

“It’s OK to fail as long as you’re learning and moving forward,” said Shawn Rogers, vice president of Research, Business Intelligence and Data Warehousing at Enterprise Management Associates (EMA), a cloud services company headquartered in Boulder, Colorado, USA. “But you can’t take two years to fail. Failure is not the scary thing. Failing slow is the scary thing.”

Cloud computing, Rogers said, allows companies to move more quickly than previous computing models to deploy new strategies and better compete in a rapidly accelerating, digitally enabled, data-driven business world. That speed is critical to success in an environment where new competition can pop up almost overnight and market demands shift almost as quickly.

Of all that cloud computing promises – elastic information technology (IT) infrastructure, fast deployment of data and applications, greater productivity, improved cost control, support for mobility – being able to quickly execute new strategies and then determine their success is no small advantage.


When US-based cosmetics maker Revlon launched a program to globalize its operations, Senior Vice President and Chief Information Officer David Giambruno set about to virtualize nearly all IT functions with an outside cloud vendor. His concern wasn’t that the company’s IT budget (2% of its US$1.4 billion revenue) was too high, Giambruno said in an interview led by Revlon’s virtualization vendor. He just wanted to make Revlon’s operations better and simpler.

“Today, most companies understand the cloud has revolutionized IT.”

Stephane Maarek

Revlon transferred more than 500 IT applications representing 98% of its global workload to a private cloud service in 2011. Since then, the move has saved US$70 million, triggered a 300% increase in project throughput, reduced downtime to zero and cut data center energy use by 70%. For sheer agility, Revlon’s cloud-based operation proved itself when its data center in Venezuela was destroyed by fire. Revlon switched its IT operations to New Jersey in just two hours.

Cloud comes in so many varieties that calculating the comparative costs of on-premise versus cloud-based IT can be complex. For example, global IT research analyst Gartner estimates that the annual cost of owning and managing software applications can be four times the cost of the initial purchase. The Total Cost of Ownership (TCO) of cloud-based Software as a Service (SaaS), meanwhile, looks beyond the initial investment in SaaS licenses to include the costs of customization, scalability, integration with existing applications, training and support, making direct comparisons difficult.

Renting hardware on the cloud tends to be more straightforward, costing about one-third as much as purchasing and maintaining the equipment in-house, according to the McKinsey Global Institute (MGI), the research arm of global management consulting firm McKinsey. No wonder the Cisco Global Cloud Index estimates that global cloud traffic will grow by a factor of six by 2019.
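To see how these ratios play out, consider a rough back-of-the-envelope sketch. The dollar amounts below are hypothetical, chosen only for illustration; the multipliers (annual software ownership costs of roughly four times the initial purchase, cloud hardware at roughly one-third the in-house cost) are the figures attributed to Gartner and MGI above.

```python
# Illustrative TCO arithmetic using the ratios cited in the text.
# All dollar figures are hypothetical examples, not real benchmarks.

def on_premise_software_tco(license_cost: float, years: int) -> float:
    """Initial license plus an annual owning/managing cost of ~4x the license."""
    return license_cost + 4 * license_cost * years

def cloud_hardware_cost(in_house_cost: float) -> float:
    """Renting on the cloud at roughly one-third the in-house equipment cost."""
    return in_house_cost / 3

if __name__ == "__main__":
    license_cost = 100_000    # hypothetical initial software purchase
    hardware_cost = 600_000   # hypothetical in-house hardware budget

    # Over five years, the ownership multiplier dominates the license price.
    print(f"5-year on-premise software TCO: ${on_premise_software_tco(license_cost, 5):,.0f}")
    print(f"Cloud rental vs. ${hardware_cost:,} in-house hardware: "
          f"${cloud_hardware_cost(hardware_cost):,.0f}")
```

The sketch makes the article’s point concrete: even before adding the SaaS-side costs of customization, integration, training and support, the initial purchase price is a small fraction of what ownership actually costs over time.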


The concept of cloud computing has been around for nearly two decades. Salesforce.com, for example, was founded on the cloud 15 years ago. Workday, a human resources cloud service company, was established in 2005, and Amazon Web Services opened for business in 2006. But technological improvements, including greatly increased network speeds and bandwidth, are helping cloud computing achieve critical mass. According to an MGI report published in 2013, most IT and web applications and services could be cloud-delivered by 2025, with an economic impact of as much as US$6.2 trillion.

The challenge for CIOs begins with deciding when and how to invest in the cloud. From there, executives must manage the transition of company operations out to the cloud, which means transforming processes and redefining the roles and responsibilities of IT personnel.

“As IT departments have grown with growing companies, they have found there’s not enough staff to do everything in-house,” said Stephane Maarek, vice president of Business Development at Outscale, a cloud services company based in France. “It’s difficult to find the talent. The cloud frees up talent because you can stop building in-house stuff you don’t need to build in-house. You can instead use your staff to do strategic planning.”

US$6.2 trillion

MGI estimates that by 2025, the economic impact of cloud-delivered IT and web applications and services will reach US$6.2 trillion.

Jeanne Ross, director and principal research scientist of the Center for Information Systems Research at the Massachusetts Institute of Technology (MIT) Sloan School of Management, sees other advantages in the cloud. “One great thing about cloud is it forces discipline on you because you can’t customize the software,” Ross said. “You can configure, but not customize. It’s amazing how much faster it goes when you can’t change it. It’s too bad that so many people are afraid of cloud for security reasons. Most companies that have moved fairly aggressively have found cloud security is much better than their own security was, because cloud providers update it all the time.”

According to a 2013 global survey conducted by Cisco in partnership with Intel, titled “Impact of Cloud on IT Consumption Models,” data security is the number-one concern about cloud, but it is also one of the top three business drivers for cloud adoption, indicating that respondents are cautiously optimistic that cloud providers can do a better job of securing their data than any single corporation can. The survey of 4,226 IT leaders in 18 industries and nine major economies also indicates that robust security and data-protection capabilities are the most critical factors for cloud-service providers in winning respondents’ business, creating a major incentive for providers to deliver on that optimism.


Some industry watchers have described cloud computing as really more like sky computing, where lots of clouds float around in the form of distinct outsourced IT service categories that can be tapped as needed. These include:
• NaaS (Network as a Service): Users get access to connectivity services, including flexible and extended virtual private networks (VPN) and bandwidth on demand.
• IaaS (Infrastructure as a Service): IaaS provides computers and other resources, including data storage, software, firewalls, and IP addresses, from large physical data centers connected by the Internet or VPNs.
• SaaS (Software as a Service): Users buy access to application software and databases. SaaS is usually priced on a pay-per-use basis with a subscription fee.
• PaaS (Platform as a Service): With access to operating systems, databases and web servers, users can work without the cost and hassle of buying and managing their own hardware and software.
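One way to keep the four models straight is by asking who manages which layer of the technology stack. The sketch below is an illustration of that division of responsibility, not a definition from the article's sources; the layer names and assignments are simplified assumptions:

```python
# Simplified illustration: which stack layers the provider manages
# under each cloud service model (layer names are assumptions).
STACK = ["network", "hardware", "operating system", "runtime", "application"]

PROVIDER_MANAGES = {
    "NaaS": {"network"},
    "IaaS": {"network", "hardware"},
    "PaaS": {"network", "hardware", "operating system", "runtime"},
    "SaaS": set(STACK),  # the provider runs everything, application included
}

def customer_manages(model):
    """Layers left to the customer under a given service model."""
    return [layer for layer in STACK if layer not in PROVIDER_MANAGES[model]]

print(customer_manages("IaaS"))  # customer still runs the OS and up
print(customer_manages("SaaS"))  # nothing left to manage in-house
```

Read this way, the models form a spectrum: each step from NaaS toward SaaS hands one more layer of responsibility to the provider.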

“One great thing about cloud is it forces discipline on you because you can’t customize it.”

Jeanne Ross
MIT Sloan School of Management

Clouds also come in public, community and private versions: Public when cloud services are accessed over a public network such as the Internet; community when a group of organizations shares cloud infrastructure; and private when the cloud architecture is devoted to a single organization. Hybrid clouds combine two or more private, community or public clouds.

“Ten years ago, people were still trying to wrap their minds around the idea of the cloud,” Outscale’s Maarek said. “Today, most companies understand the cloud has revolutionized IT. Beyond all the cost savings, the other thing that is revolutionary about it is that the cloud opens doors to new scenarios. It helps companies invest in projects they otherwise couldn’t invest in. It helps companies fail faster and succeed faster. It just accelerates everything.”

While Maarek acknowledges that many companies remain cautious about how best to approach cloud services, the overall trend toward adoption is unmistakable. “Once people start going into the water, everyone follows.”


As inexorable as the cloud trend appears, executives should be prepared to ask important questions as they evaluate the many providers delivering cloud services. Shawn Rogers of IT analyst firm Enterprise Management Associates (EMA) offers his Top 10 points to consider:

• Pricing: In terms of total cost of ownership, cloud can sometimes be more expensive than in-house solutions, so do the math and compare.

• Architecture: Understand it. Small cloud companies sometimes struggle to match features with a company’s needs.

• Service Management: Consider service agreements carefully.

• Transparency: “I like cloud companies that are willing to tell you when they fail,” Rogers said. “Expecting cloud companies to be perfect is not reasonable.”

• API: Evaluate the provider’s application programming interfaces (APIs), which determine the quality of application-to-application communications.

• Security: Cloud companies have made terrific strides in meeting these legitimate concerns, but do your due diligence.

• Proof of Concept: Ask for free trials, plus scale and integration testing.

• Training: How will the provider support training? With seminars, on-site sessions or conferences?

• Vendor Strength: Look at the provider’s management structure and organization. Make sure you’re confident in the company’s stability.

• Exit Strategy: “People never think about leaving, but there are issues with leaving,” Rogers said. “I have seen agreements with penalties for leaving.”
