Simulating humans

3D modeling opens new research pathways in medicine

Lisa Rivard
3 March 2014

3 min read

Increasingly, medical researchers are using sophisticated modeling and simulation software to collate and capitalize on vast stores of biological data and generate new insights about the complex systems and physiological interactions of the human body. From modeling bone and muscle to simulating the human heart and brain, groundbreaking new treatments are on the horizon.

Today, advanced modeling and simulation technologies are allowing medical researchers to create “computational” models of even the most complex human systems such as the heart and brain. Using mathematical algorithms, patient data and simulation-based predictive computational techniques, researchers can simulate, predict, and understand the most intricate subsystems of human organs and systems, providing new opportunities to develop improved treatments.


No one is more surprised at these advances than Dr. Julius Guccione, a professor and cardiothoracic surgery researcher at the University of California, San Francisco, School of Medicine (USA), who initially believed many of the human body’s systems were simply too complex to model. Yes, simulation and biomechanics have been widely used in orthopedic surgery for treatments such as knee and hip replacements, but bones are structural components akin to the mechanical structures designed by manufacturers. Human organs, Guccione believed, were an entirely different challenge.

“Modeling and simulation have been used very successfully in the design of automobiles and airplanes, but many of us viewed the heart as being orders of magnitude more complicated than any other engineering structure being tackled,” Guccione said. “But when I saw firsthand the complexity of the mathematical models used in these other industries, it really changed my mind. The detail and size of the models and the computational time is very, very impressive.”



Heart fibers undergo extreme deformations, with at least 20% elongation and contraction throughout the cardiac cycle, Guccione said. “It’s a much more nonlinear problem than modeling human bone, where any deformation is difficult to detect,” he said. “It’s unusual for a solid to have such extraordinary material properties.”
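The nonlinearity Guccione describes can be illustrated with a toy calculation (an illustration only, not his team's actual model): soft tissue is often described with an exponential, Fung-type stress–strain law, while bone at its barely detectable strains behaves almost linearly. The parameter values below are invented for illustration.

```python
import math

def linear_stress(strain, stiffness=10.0):
    """Hookean (linear) model: stress proportional to strain."""
    return stiffness * strain

def fung_stress(strain, c=1.0, b=10.0):
    """Toy Fung-type exponential model of the kind often used for soft
    tissue. Parameters c and b are illustrative, not fitted to myocardium."""
    return c * (math.exp(b * strain) - 1.0)

# At the tiny strains typical of bone, the two models nearly agree;
# at the 20% strain of a beating heart, the exponential model diverges sharply.
small, large = 0.01, 0.20  # 1% vs. 20% strain
print(linear_stress(small), fung_stress(small))
print(linear_stress(large), fung_stress(large))
```

At 1% strain the two curves are nearly indistinguishable, which is why linear models suffice for bone; at 20% strain the exponential stress is several times the linear prediction, which is the sense in which heart tissue is "a much more nonlinear problem."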

Today, however, Guccione and his colleagues are using mathematical modeling and simulation techniques to test the safety and efficacy of a biopolymer gel injection designed to strengthen and stabilize the human heart during episodes of heart failure. Using these techniques to understand the mechanism of individual patients’ hearts will allow researchers to tailor the amount of gel and the administered location to achieve the most beneficial effect for each patient.


Another important frontier for the use of human modeling and simulation technologies is in regulatory approval. Dawn Bardot, senior program manager for modeling and simulation at the Medical Device Innovation Consortium (MDIC), a public-private partnership that aims to advance regulatory science for the medical device industry, said that medical device manufacturers use modeling and simulation extensively to sort through competing designs and examine field failures. But they have not yet gained the credibility required by regulators for making safety and efficacy decisions.

Developing modeling and simulation techniques to the regulatory level will require new approaches and a collaborative effort to better understand complex inputs such as material properties, human anatomy, physiological conditions and disease states, according to MDIC. In parallel, research teams such as Guccione’s are focusing on developing systems-level models.

“We really don’t have good cartography of the human body’s variability from an anatomical, physiological or material property perspective yet,” Bardot said. “But when we have those system-level models and simulations, we can move away from bench-top tests that attempt to represent something about a human, eliminate animal testing, and reduce the number of clinical trials and patients involved in them.”


Researchers like Guccione and his colleagues are on the leading edge of a new push to better understand the biological pathways and systems that connect and drive the intricate systems of the human body, said Dr. Alan Louie, research director at IDC Health Insights, based in Framingham, Massachusetts (USA). “We’re very early in understanding the interactions of the different biological pathways, systems and organs of the human body – and they’re very different than the physical systems, such as bone and skeletal muscle,” Louie said. “But we’re beginning to be able to simulate some of these models.”

The research is still in the early stages of development. “Translating a model into a computational-based system is, to a large extent, ‘linear’ thinking, and some of these complex systems cannot be explained using linear thinking,” Louie said. “The ability to bring the right pieces together is the hard part.”

Many new tools are emerging that have the potential to deliver some value in this space, he said. One well-known example is Watson, the IBM supercomputer best known for beating the best human competitor in the US knowledge game show “Jeopardy.” “Watson takes stores of knowledge as decision-making points and overlays that onto available data sets,” Louie said.

Watson helps researchers overlay health data, including electronic medical records, with genomic discoveries, population biology data, social media insights on the comparative effectiveness of drugs, and other medically relevant data from diverse sources. Watson then filters those inputs through a system that identifies commonalities.

“Simulation of human systems is coming,” Louie said. “It relies on systems to connect vast amounts of diverse and far-flung data, accommodate nonlinear thinking and capture the immense complexity, but the ability to visualize all of these interactions is going to be a very powerful enabler.”

Subliminal messages

Color influences consumer choices

Lisa Rivard

3 min read

The color of products, their packaging and their surroundings can have a major impact on their success in the marketplace. Consumer packaged goods manufacturers continue to refine their color choices, working hard to deliver the intended message about their products.

According to research conducted by The Color Institute, people make a subconscious judgment about a person, environment or product within 90 seconds; between 62% and 90% of that assessment is based on color alone. In fact, Web analytics firm Kissmetrics reports that 85% of shoppers cite color as the primary reason for buying a particular product.

“Fifteen years ago, you’d have to do a real selling job to convince people that packaging color was important,” said Leatrice Eiseman, executive director of the Pantone Color Institute in Carlstadt, New Jersey (USA). “But I think we’ve seen things happen in the marketplace, such as the colorful Apple product lines, that show us the psychology of color is hugely important.”


Color is a marketer’s first tool in communicating across all human emotions, said Leslie Harrington, executive director of The Color Association, a color forecasting firm based in New York City (USA). “Before someone experiences a product in other ways, they see it … and that’s all about color,” she said.

Humans have three “relationship” levels with color, Harrington said. The first is universal; on this level, everyone reacts to a color in the same way. For instance, red is universally associated with passion, while blue is calming. The second level is societal or geographical. For instance, people in China react to pink differently than people in the Americas do. And the third level is an individual relationship based on life experiences, which may override the other two levels.

As humans move through life, Harrington said, they collect color associations that affect their subconscious reactions. “The reality with color,” she said, “is that the consumer intuitively knows if it’s wrong. Anything that comes off as wrong, they just won’t buy.”

COLOR EMOTION GUIDE: How color impacts consumer behavior


Marketers must educate themselves about how color “attaches” itself to the products they’re trying to sell, Eiseman said. “You have to understand the message you are trying to convey or the mood you’re trying to set. It’s also critical to know your competition. You don’t want to blend in with the next can of soup. And last, you have to be conscious of building a brand image so distinctive that no matter what other colors you use, there’s still recognition of your company’s persona.”

Sometimes, delivering a product’s message effectively means doing the unexpected with color. When Kimberly-Clark introduced “U by Kotex,” a new line of products aimed at younger women, it stepped away from the category’s traditional pastel-colored packaging, opting instead for bright fluorescent wrappers packaged in striking black.

Kim Greenwood, senior manager of virtual reality at Kimberly-Clark, said the packaging conveys exactly the young, fresh message her company wanted for the product. The award-winning design is paying off in sales and market share.


Laying claim to a color can be a game changer for a corporate brand. Harrington said her group found that, with the exception of primary colors, more people know what a color does not convey than what it does convey. And that, she said, creates an opportunity for a company to “own” a color.

Case in point: the package shipping giant United Parcel Service (UPS), which hung its branding on the color brown. “When consumers don’t know what a color stands for, companies have the opportunity to make it stand for something,” Harrington said. “That’s how UPS did the whole ‘What can brown do for you?’ campaign. They made brown about friendly, good service delivered by sexy guys in brown shorts and knee socks. No one else in their industry can touch the color brown now.”

Given the importance of color, Eiseman said, most companies are turning to color experts for help in interpreting and acting on color information. In the process, Harrington said, companies quickly realize two things: how difficult color decisions are, and that they are making them too late in the design process.



“Color considerations are usually part of the secondary thinking,” Harrington said. “But more and more, color has to be moved up to a strategic level within the conversation. Once you’ve identified brand attributes and personality, you should immediately consider the colors related to that. Make them part of the thinking process right up front. Every single thing we make has a color. Why not make the decision about color a smart and strategic one that can be leveraged as part of your product platform?”

Integrated organizations

Working as one improves industrial-equipment makers’ competitiveness

Nick Lerner

2 min read

With increasingly sophisticated products, supply chains and customer expectations, industrial-equipment manufacturers are experiencing intense pressure to deliver. Compass spoke with Cambashi founder Mike Evans about strategies to manage the industry’s growing complexity.

COMPASS:  What challenges are industrial-equipment makers facing?

MIKE EVANS:  Machine builders must accelerate time-to-market of new machines while reducing costs and ensuring greater safety and reliability. They must also comply with strict standards and stringent regulations.

Customers also demand that today’s machines do more than ever before, with advanced monitoring, sensors and automation bringing ever-increased functionality. This increased complexity is challenging to manage and makes it harder to compete effectively.

Like most businesses today, industrial-equipment makers also have to connect within and beyond their enterprises on a global scale. Many companies that internally managed sales, design, engineering and installation now outsource these specializations. But coordinating multiple streams of information is problematic. If these streams remain unconnected, labor and capital are wasted through repetition of work and errors in engineering and manufacturing.

How can industrial-equipment manufacturers manage complexity to deliver a better customer experience?

ME: Companies are moving to customer-centric teams where skills and tasks are organized around customers’ needs. On behalf of their customers, they have to make and install machines that have 100% operational uptime.

That’s a big order. What strategies can help manage it?

ME: It definitely requires an integrated working methodology that allows customers to see exactly what they are buying, often simulated in the context of their own operations before the machines are built. That leads to fewer mistakes and more satisfied customers. Within an integrated working methodology, all parts of the enterprise touch. Customer, supply chain, product data, mechatronics and software are all combined on one platform used by each and every stakeholder.

What are the commercial and technical advantages?

ME: Synchronizing requirements drives efficiency, allowing more functionality to be incorporated into machines while delivering improved output. This virtuous circle is completed when improved profitability allows greater investment to replace labor with capital.

Technology is the enabler. Properly deployed, advanced and integrated technology leads to greater precision, remote monitoring and machines that can, for example, automatically compensate for wear to improve accuracy. These advances lead to more output from machines for the same or lower cost. Connecting the activities of the extended enterprise also eliminates waste. An engineer may be able to support more than one site, for example, because they can oversee and optimize machine operations and maintenance remotely.

What types of companies can best use this type of integration?

ME: Many people assume that this is exclusively the preserve of large companies, but small companies and startups with little or no structure in place can also reap the benefits. There are always opportunities, and the rewards can be revolutionary in terms of output, productivity and competitive advantages.


What about advances in the machines themselves?

ME:  Mechanical, technological and workflow interfaces are becoming standardized, while machines and the systems and people that make them are fitting together more effectively. Academics are actively looking at how software, electronics, pneumatics, and connections between machines, systems and people can be further integrated. This is a major challenge that, when fully solved, will ensure absolute consistency and predictability across disciplines.

What do machine makers need to do to position themselves for these trends?

ME:  Implement a strong organizational foundation that touches all functions. Machine builders must organize themselves to take advantage of technology as it becomes available. Companies that have implemented policies that join their organizations seamlessly with their products are best positioned for future success.

Mike Evans, the founder of Cambashi, specializes in the economic impact of software applications on engineering, manufacturing and automation. His company, which integrates the activities of a management and marketing consultancy, an industry analyst firm, and a market research company, works as an independent advisor to leading IT companies and government agencies worldwide.

Innovations in investing

Radical changes in market conditions require new solutions

William J. Holstein

4 min read

Amin Rajan, chief executive of CREATE-Research, a research boutique based in London, UK, works with financial institutions in general and asset managers in particular to improve their innovation processes. Rajan’s latest study focuses on innovation challenges for asset managers who invest US$65 trillion in assets worldwide, yet lag in innovation. In an interview with Compass, Rajan spelled out his recommendations for adapting to this new era.

COMPASS:  How is the asset management business doing?

AMIN RAJAN: The industry has been in a crisis mode for 15 years. It has had two significant bear markets, which were two of the four worst bear markets in the past 100 years. Investors are scared. Their environment has changed radically. The 24-hour news cycles amplify investor paranoia and glee. With globalization, when something goes wrong in one country, it affects many others. The old style of investing isn’t working. Clients are saying, “We need new types of products. We can’t cope with this feast-and-famine mentality that has prevailed since the collapse of Lehman Brothers.”

As investors get closer to retirement, are they becoming more demanding?

AR: It’s quite startling. Around 75% of assets held by retail asset managers are owned by people who are either retirees or near-retirees. Their risk profile is very different. They are saying, “I want assets that give me income and low volatility. I want certainty rather than volatility.”

What can asset managers do to innovate in response to these needs? 

AR: In some cases, they can offer better versions of old products. They also have to engage with their clients much more than in the past. Market dislocations create price anomalies or mispricing. In that kind of environment, it is possible to convert volatility into gain. Asset managers also need to help investors to avoid herd instincts. That has killed many portfolios.

It seems you’re suggesting that asset managers have to adopt a more activist style.

AR: Right. The old way was to “set it and forget it.” But because of the volatility, that style doesn’t work anymore. Retirement planning is not a one-shot decision like buying a house. Jobs change. Family situations change. You have to be savvier and to adapt, especially when markets are moved more by politics than economics.


What about creating new products?

AR: You can create new products, or you can update existing ones. If you are selling a fixed annuity, how about linking it to price inflation? Or how about having a draw-down product? A customer invests only half of his or her portfolio into an annuity. The other half is invested in a draw-down product where funds are reinvested to deliver regular income and inflation protection. The eggs are spread among different baskets.

Do you have suggestions for managing the risks involved in creating new products?

AR: The way it has worked in the past is that a salesman met a client and found an interest in a particular product. The asset management team put the product together and then they sold it. They didn’t do stress testing. They didn’t do market research. They just went for it. But a good process will develop a business case around the product, then do a proof of concept. This is like putting a new car through a wind tunnel. A product should be tried and tested. Perhaps it should be run as a “paper portfolio” for six months to see how it works.
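Rajan’s “paper portfolio” idea amounts to marking hypothetical positions to market without committing capital. A minimal sketch of that bookkeeping follows; the fund names, holdings and price history are invented for illustration, not real products or data.

```python
def paper_portfolio_value(positions, prices):
    """Mark a hypothetical (paper) portfolio to market.
    positions: {asset: units held}; prices: {asset: current price}."""
    return sum(units * prices[asset] for asset, units in positions.items())

# Invented example data -- a real trial would replay months of market prices.
positions = {"FUND_A": 100, "FUND_B": 50}
history = [
    {"FUND_A": 10.0, "FUND_B": 20.0},  # month 1
    {"FUND_A": 10.5, "FUND_B": 19.0},  # month 2
    {"FUND_A": 11.0, "FUND_B": 21.0},  # month 3
]
values = [paper_portfolio_value(positions, month) for month in history]
print(values)  # how the product would have performed, with no capital at risk
```

Running such a replay over six months of real prices would expose a flawed product design before any client money is involved, which is the point of the wind-tunnel analogy.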

So asset managers haven’t had the benefit of fully developed innovation processes, like technology companies do?

AR: No, which is why asset managers have to move in the same direction as 3M and Johnson & Johnson. They have to develop a very robust process before they bring anything to the market. They need to create a bank of ideas submitted by people from every part of the organization. The ideas have to be evaluated by peers. The good ones are escalated.



But the financial industry attracts highly individualistic people who prefer to come up with their own solutions, doesn’t it?

AR: Yes, but that is changing, because the asset management business is becoming more industrialized. It has to blend ideas with processes. The craft nature of the business is giving way to process discipline. It is not a cottage industry anymore.

What role does tougher regulatory scrutiny have?

AR: More regulation means you can’t just come up with an idea. Now, it has to be approved via a robust process. Here in the UK, before I register a product with the regulators, I have to show them all the thinking that has gone into its creation. There also is more stringent enforcement of existing rules. Regulation is a double-edged sword. It is slowing things down, but it is also ensuring greater rigor.

Your report observed that there are hurdles, or “blockers,” that discourage innovation. What are they?

AR: In cultural terms, we see the prevalence of a “permission” culture, where people are reluctant to think outside the box. There are also dysfunctional tensions between members of innovation teams – especially between portfolio managers who understand market dynamics, and sales people who understand client psychology. There is also an absence of tools that promote real-time collaboration among people.

So what is your advice to asset managers?

AR: The first step is to tackle the factors that act as blockers to innovation. It is easy to identify these barriers, like the syndrome where people are not open to ideas they didn’t invent. And you have to make sure that processes are in place. I hate bureaucracy, but I know we must go through certain processes to ensure the integrity of our products. Lower or eliminate the barriers to good ideas, give people the tools to collaborate, and put in place the processes.

Connecting the dots

Regulatory strategies now and in the future

Lisa Rivard

3 min read

As the global population ages, demand to bring safe yet innovative new products to market faster and more affordably is growing, challenging life science companies and regulatory authorities alike to reinvent approval processes for development of drugs and devices.

According to the US Bureau of Labor Statistics, today’s pharmaceutical and medical device industry is spending twice as much on research and development to produce just 40% as many new treatments as it did a decade ago. Faced with dwindling pipelines, the industry is striving to reinvent itself – and regulators are lining up to help with streamlined review processes for new treatments.

Expert observers recommend the industry focus on its development processes to reduce the time and costs involved in bringing new medicines and devices to market. For example, Dr. Scott Gottlieb, the US Food and Drug Administration’s (FDA) deputy commissioner for Medical and Scientific Affairs, singles out the empirical, statistical method that currently dominates healthcare R&D for being inflexible, restricting innovation and resulting in “overly large” clinical trials that yield information about large populations, but not about individual patients.


Clinical tests and measures known as “biomarkers” can be used to group patients with related-but-distinct conditions, enabling drug makers to better target their clinical trials and cut development costs by as much as half, global consulting firm PricewaterhouseCoopers (PwC) observes in its study “Pharma 2020: The Vision.” Thanks to such improved, targeted patient information, PwC believes that by 2020, all medicines will be approved on a real-time basis with highly restricted “live licenses.”

Live licenses will be contingent on extensive in-life testing in small populations of patients, then extended to larger numbers of patients or different populations as a medicine’s safety and effectiveness is proven. “Every medicine on the market will have a prearranged, fully automated pathway through its lifecycle, and its development will be a continuous process rather than ending when it is approved,” the consulting group predicts.

But such a transformation will require the support of regulators, coupled with changes in the legislation governing new medicines and devices. “Regulators will insist on greater collaboration and expect to be consulted on a regular basis from a much earlier point in development,” PwC says.


Regulatory authorities are under great pressure, particularly with regard to safety, says Baba Awopetu, European marketing director at Leica Microsystems, based in Wetzlar, Germany, and a global leader in innovative imaging systems for surgical applications. “First and foremost, they need to be seen as looking after the public.”

But Awopetu believes there must be a balance between safety and an emphasis on bringing new, innovative treatments to the market when their benefits to patients outweigh their risk profiles. “If we want to tackle some of the nasty diseases of the 21st century, we have to be willing to tolerate some risks to bring really good innovations to the market,” he said.

The challenge for regulators, Awopetu says, is to find a balance between caution and innovation. PwC agrees: “Regulators are increasingly looking for evidence that new medicines are not just safe and effective, but better than comparable existing therapies.”


To that end, regulators around the world are collaborating more than ever before. “There’s a move these days toward a more consortia-type approach to drug development,” says Dr. Kenneth Kaitin, professor and director at the Tufts Center for the Study of Drug Development in Boston, Massachusetts (USA).

Stealing a page from the industry’s focus on partnerships and strategic alliances, Kaitin says, regulatory agencies are collaborating to share information and review data. “There’s an awful lot of data out there that’s being reviewed by sister agencies, and they are talking more about it,” Kaitin said. “Moreover, because the types of applications agencies are being asked to review are becoming more and more complicated, I think that going forward the level of inter-agency collaboration to resolve questions will only increase.”

The European Medicines Agency (EMA) in London, UK, which is responsible for the scientific evaluation of pharmaceutical products for use in the European Union (EU), already maintains a database of all clinical trials conducted in the EU, which could become a global platform to ensure transparency of all trial data. Meanwhile, the US FDA is investigating an entirely paperless submission process, as well as an electronic exchange for sharing clinical research information.

“There’s a lot of interest in harmonizing among international regulators,” said Dr. Chris Milne, director of research at the Tufts Center for the Study of Drug Development. “Let’s face it. Diseases don’t recognize geographic borders. Vigilance is a shared goal among the major regulatory agencies. There’s a general desire and movement toward a common framework that will provide a platform for innovation.”


The pharmaceutical and medical device industry is spending twice as much on research and development to produce just 40% as many new treatments as it did a decade ago. US BUREAU OF LABOR STATISTICS

Regulatory authorities also will be working more closely with the companies they regulate, Kaitin predicted. There’s a growing conversation between companies and the US FDA, for instance, about new approaches, such as improved clinical trial design, he said.

“It’s in the interest of both groups to create a more efficient drug development process to speed development times and increase success rates for new products,” Kaitin said. “In the US, the FDA has been given a green light by Congress to engage and work with industry to help them do a better job of bringing new medicines to market. Ultimately, that helps patients, too.”

Fully digital

Banks tackle do-or-die automation challenge

Charles Piggott

3 min read

As regulators demand transparency and shareholders push for improved efficiency, financial services companies still rely on cumbersome manual processes in many key aspects of their business. To make all of their activities as efficient and visible as routine transactions, banks need unified systems that can handle both processes and documentation digitally.

Every day, up to US$74 trillion moves across a financial supply chain on which billions of people in 190 countries depend. Yet despite enormous success in straight-through processing (STP) of core payments and transactions, banks have struggled to fully automate many basic administrative processes, from regulatory procedures to new product development.

Changing customer, regulatory and shareholder expectations have fired the starting gun, however. Banking supervisors want greater transparency, compliance and oversight; shareholders demand cost-efficiency gains; and digitally sophisticated consumers seek seamless service across multiple channels.

“Events in 2008 were definitely a trigger point,” said Kevin Sullivan, head of technology architecture at State Street Global Advisors, the world’s second-largest asset manager, in reference to the failure of Lehman Brothers, which nearly toppled the global financial system.

“Those events brought higher levels of regulatory oversight that have made risk and audit controls a clear priority, and this means higher levels of automation.”


The benefits of robust digital systems are many: improved transparency, traceability and data access; reduced potential for error and manipulation; increased innovation; reduced time to market; and significant cost savings.

“I would say we’re looking at possibly 35% to 40% of the remaining manual processes in banking that could be automated, particularly where interaction between different systems involves manual input,” said Nancy Atkinson, senior analyst at Aite Group, an advisory firm focused on financial services technology and regulation. “So there is plenty of room for improvement.”



Others put the figure even higher. “Even though we have already taken a large number of manual instructions out of the system, I would say roughly 50% of the remaining manual processes could be automated,” State Street’s Sullivan said. “By reducing the number of people that need security access to perform manual tasks, automation has also increased security. I would say we’ve reduced security risks by more than 30%, maybe even 50%.”

Jan Paul van Pul, head of market infrastructures for Netherlands-based ABN Amro, said that the majority of current IT investment in financial services is being driven by regulations. “Something like 50% to 70% of banks’ strategic IT investment is targeted on areas related to regulatory initiatives,” he said. “The rest is on optimizing costs, improving cost/income ratios and driving innovation.”


Many core banking processes still involve a high degree of manual work – especially where operational and management processes, regulatory and legal compliance, and governance are handled informally. Such informal processes make forensic investigations costly and time consuming. Trying to reconstruct decisions made with calls, emails and faxes, using spreadsheets that have been modified with little or no record of what was changed, why and by whom, also exposes decision-makers to significant risks.
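The traceability problem described above – spreadsheet edits with no record of who changed what – is commonly addressed in digital systems with an append-only, hash-chained audit log, where each entry commits to the one before it. The sketch below is a toy illustration of that idea, not any bank’s actual system; the actors and actions are invented.

```python
import hashlib
import json

def append_entry(log, actor, action, details):
    """Append an audit entry chained to the previous one by hash,
    so later tampering is detectable. A toy sketch, not a bank system."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {"actor": actor, "action": action, "details": details,
             "prev": prev_hash}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)
    return log

def verify(log):
    """Recompute each hash; any edit to an earlier entry breaks the chain."""
    prev = "0" * 64
    for e in log:
        body = {k: e[k] for k in ("actor", "action", "details", "prev")}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if e["prev"] != prev or digest != e["hash"]:
            return False
        prev = e["hash"]
    return True

log = []
append_entry(log, "analyst", "modify_cell", {"sheet": "risk", "cell": "B2", "new": 1.5})
append_entry(log, "manager", "approve", {"ref": 0})
print(verify(log))   # intact chain verifies
log[0]["details"]["new"] = 9.9  # an undocumented change, as with an emailed spreadsheet
print(verify(log))   # the silent edit is now detectable
```

Contrast this with the informal processes in the paragraph above: here the record of what was changed, by whom, and in what order is a verifiable property of the data itself rather than something to be reconstructed from calls and emails.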

However, a bank whose executives can collaborate on a single platform governed by strict digital protocols and embedded documentation procedures, including digital signatures, is more likely to inspire regulators’ confidence. Using more extensive digital processes also offers advantages in customer service and understanding.

“Paper is being driven into a corner,” said Sailesh Panchal, head of function: Payments Design at UK-headquartered Lloyds Banking Group. “It is now very hard to introduce any new process that includes a manual element. There are very strict group guidelines against that, and you have to go through a governance process giving clear justification.”


Lloyds Banking Group has invested heavily in re-engineering its IT systems. “In 50 or so cases, upgrading to integrated IT systems involved a full rip-and-replacement of existing services,” Panchal said. “Some of the results have been dramatic; for example, account opening and closing times have been cut from hours to almost instantaneous.”

Justifying IT projects during a period of economic uncertainty has been challenging. “What we found was that if you take each process on an individual level, the numbers just didn’t stack up,” Panchal said. “But when we started grouping them together we found capability clumps, and this is where we are making progress.”



Banks that become fully connected and digitized can reap the value of superior transparency, information and insight while eliminating the risks and costs of disjointed, untraceable manual processes. In five years’ time, banks like ABN Amro, State Street and Lloyds that invested early in digitizing operational processes will be the winners – and those that did not automate risk being the losers.

Charles Piggott is managing editor of Global Risk Regulator, a Financial Times publication on bank supervision and regulation. His articles have also appeared in The Banker, Euromoney, BusinessWeek, The Independent and The Wall Street Journal.

Underwater archaeology

Digital technology helps divers explore lost ship of Louis XIV

Dora Laîné

3 min read

Using 3D digital reconstruction and simulation, marine archaeologists are revolutionizing the way they prepare for deep-sea missions – and creating remarkable views of never-before-seen archaeological treasures. Compass talked to marine archaeologist Michel L’Hour to discover how the technology is being applied to a ship that sank during Louis XIV’s reign.

As many as 200,000 archaeological sites lie off the Mediterranean coast of France – a window to past cultures and part of an estimated 3 million underwater sites worldwide. France’s DRASSM (Department for Subaquatic and Underwater Archaeological Research) and its director, Michel L’Hour, are responsible for studying and protecting the French sites – and three-dimensional (3D) simulation technologies are helping to make their work safer and more productive.

“Marine archaeologists face unique challenges, principally due to the great depths at which many sites lie,” L’Hour said. “Reaching an underwater site requires sophisticated equipment and care to ensure all necessary and appropriate safety measures are taken. We also have a responsibility to preserve wrecks and their artifacts from unscrupulous looters.”

One important wreck is the Lune, a French royal warship sunk during the 17th-century reign of Louis XIV. For more than three centuries, the wreck lay hidden beneath 90 meters (300 feet) of water off the coast of Toulon – until Titanic specialist Paul-Henry Nargeolet accidentally discovered it in 1993.

It took nearly 20 years for L’Hour’s team to raise funds for a Lune expedition and build an advanced underwater research vessel, the André Malraux. In 2012, L’Hour and a team of marine archaeologists, assisted by divers of the French Navy and engineers specialized in 3D virtual technologies, were finally ready to unlock the Lune’s secrets.

For L’Hour, a marine archaeologist with more than 100 underwater expeditions to his credit, “the shipwreck is an underwater museum of 17th-century maritime history. The Lune is perfectly preserved and exists underwater as it was on November 6, 1664, when it embarked on its final and fatal voyage.”




During the 17th century, ships navigating the Mediterranean Sea were routinely attacked by pirates. In 1664, France sent 15 vessels and 9,000 men to combat them in Algeria; the fighting was so fierce that four vessels, including the Lune, were ordered to evacuate the troops. Although the ship was taking on water, it arrived safely in Toulon. But fear of the plague forced the Lune back to sea. It sank on the voyage to quarantine in Porquerolles, killing all 800 onboard.

While some wrecks can be explored with scuba gear, the apparatus is limited to depths of 50 meters (164 feet) or less; the Lune lies nearly twice that deep. “Sending a human being to the Lune presents significant hazards, which include breathing gases at high pressure, which can impair mental and motor activity, and the risks linked to decompression,” L’Hour explained. “Moreover, currents cloud the site with floating sediment, hampering visibility. To explore the Lune, we needed more advanced equipment, exploration techniques and specialized divers.”


The André Malraux is specially equipped with advanced technology, including a remote-controlled robotic vehicle. L’Hour’s archaeologists also were assisted by the French Navy and their Newtsuit; fully articulated and capable of maintaining internal pressure at one atmosphere (equivalent to sea-level air pressure), the Newtsuit eliminates gas-mixture sickness and the need for decompression. Worn by one of five specially trained French Navy pilots, and assisted by two operators posted on the André Malraux, the Newtsuit allowed divers to reach the site and safely bring a well-preserved cauldron and other artifacts to the surface for study.
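The depth figures involved translate directly into pressure via a standard diving rule of thumb: seawater adds roughly one atmosphere of pressure for every 10 meters of depth, on top of the one atmosphere of air at the surface. A minimal sketch of that approximation (illustrative only, not dive-planning math):

```python
# Rule-of-thumb ambient pressure in seawater: roughly 1 extra
# atmosphere per 10 meters of depth, plus the 1 atmosphere of
# air pressure at the surface. (Approximation for illustration.)

def ambient_pressure_atm(depth_m):
    """Estimated total pressure, in atmospheres, at a seawater depth."""
    return 1.0 + depth_m / 10.0

print(ambient_pressure_atm(50))  # 6.0 atm, near the scuba limit
print(ambient_pressure_atm(90))  # 10.0 atm, at the Lune's depth
```

A diver at the Lune would therefore breathe gas at roughly ten times surface pressure, which is why the Newtsuit, holding its occupant at a constant one atmosphere, sidesteps both gas-mixture sickness and decompression.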

Dive and salvage procedures were carefully planned in an immersive 3D reproduction of the 500 square-meter site (about 5,400 square feet). Digital-reconstruction engineers used images and videos recorded from an Autonomous Underwater Vehicle to produce an interactive, virtual 3D reconstruction of the Lune archaeological site and its most visible artifacts. Special attention was given to detailing those sections of the wreck of greatest historical interest, including the captain’s helm
and the kitchen. “It was very impressive the way these technologies were able to clean up a photograph clouded with floating sediment to produce a crystal-clear image,” L’Hour said.

The team simulated every recovery operation, manipulating virtual objects in the 3D environment before attempting them in the real world. “Simulating and practicing the complex manipulations before sending a human underwater enabled us to reduce the risk for the diver and the artifacts,” L’Hour said. “It was a perfect opportunity to test how these new technologies can be applied to underwater archaeology.”

L’Hour dreams of one day exploring underwater sites with a robot with haptic technology that can “feel” what it touches. “The ability to distinguish a rock from an artifact at such great depths can only be enhanced through the sense of touch,” he said. “These technologies will revolutionize deep-sea exploration and provide archaeologists and the general public with insights into lost-but-not-forgotten cultures. What an extraordinary privilege.” ◆

My robot, my friend?

Ethical issues arise as robots become more human

Dan Headrick

4 min read

As robots begin to comfort disaster survivors, care for children and the elderly and serve as companions for the lonely, they increasingly offer the promise – and risk – of redefining human-machine relationships.

On August 3, 2013, a Japanese rocket ignited the pre-dawn sky at the Tanegashima Space Center in southern Japan, carrying 3.5 tons of supplies in an unmanned cargo ship bound for the International Space Station and its six-person crew. 

Packed along with water, food, parts and tools was Kirobo, a 34-centimeter-tall robot resembling a child’s doll. But Kirobo is no toy; “he” is astronaut Koichi Wakata’s companion on the Space Station. Programmed to be sensitive to Wakata’s moods, Kirobo is Wakata’s ever-willing, always-learning Japanese conversationalist, designed to ease the stress of long space journeys.

Kirobo and his ground-based counterpart Mirata, part of the Kibo Robot Project, were built by scientists and engineers at the University of Tokyo’s Research Center for Advanced Science and Technology, in cooperation with Toyota Motor, Japan-based Robo Garage, and public relations firm Dentsu, to study human-robot interaction. Why? Kirobo answered that question in Japanese during a press event: “I want to help create a world where humans and robots can live together.”


In recent decades, faceless industrial robots have transformed modern manufacturing. Other machines – airline ticket kiosks, bank cash machines, automated grocery checkouts – have replaced social interactions with technology. But robots that learn, adapt, and entice humans to form emotional bonds with them challenge the very idea of what it means to be human.

Robots have been with us for decades, primarily as laborers whose attention never wavers, who never get bored doing repetitive tasks, and who never ask for vacations. But service robots help people directly. Professional-service robots are used mostly in military applications (more than 40% of the market), along with medical, logistics, construction and underwater applications. Personal domestic-service robots are used primarily for house cleaning, yard maintenance, entertainment, education and research.

At US$1.2 billion, the global market for personal-service robots is small, but unit shipments grew by 20% in 2012 over 2011, according to the International Federation of Robotics (IFR), an industry trade group.


Robot researchers today are exploring the fullness of our human social environment: our humor and language, regional colloquialisms and cultural references, our moods and emotions, facial expressions and body language – even how we interact with the other machines we take for granted every day.




That research has fueled advances in evolutionary and adaptive learning and cognition technologies, autonomous systems with somatosensory control (a replication of the biological mechanism for everything our bodies feel) and intelligent sensing. IBM and the Defense Advanced Research Projects Agency (DARPA), an agency of the US Department of Defense, are well on their way to developing a neuromorphic chip that mimics the neurons, synapses and connections in a mammal’s brain.
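The spiking behavior such chips mimic can be sketched in a few lines. Below is a toy "leaky integrate-and-fire" model, a standard textbook abstraction of a biological neuron; it illustrates the general principle only and is not the IBM/DARPA design:

```python
# Toy leaky integrate-and-fire neuron: it accumulates input current,
# gradually "leaks" charge over time, and emits a spike (then resets)
# whenever its membrane potential crosses a firing threshold.

def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Return a 0/1 spike train for a sequence of input currents."""
    potential, spikes = 0.0, []
    for current in inputs:
        potential = potential * leak + current  # integrate with leak
        if potential >= threshold:              # fire...
            spikes.append(1)
            potential = 0.0                     # ...and reset
        else:
            spikes.append(0)
    return spikes

# Sustained input eventually drives the neuron over threshold:
print(simulate_lif([0.4, 0.4, 0.4, 0.0, 0.6, 0.6]))  # [0, 0, 1, 0, 0, 1]
```

Neuromorphic hardware implements large numbers of such units in parallel silicon circuits, with synapse-like weighted connections between them, rather than simulating them in software.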

Researchers also are working with soft-tissue materials that mimic skin; the fluid articulation of the face; and the nuanced shades of emotion that we, as human beings, recognize in another person’s eyes and body language. The technology is advancing quickly, so scientists and engineers have joined with psychologists, sociologists, philosophers and legal scholars to explore the implications of a not-too-distant world in which robots and humans share social relationships.


Heather Knight owns Marilyn Monrobot Labs in New York City, NY (USA), and is a robot researcher for NASA, the Massachusetts Institute of Technology (MIT) and the Carnegie Mellon Robotics Institute.

Knight believes that robots can show that most alluring of human qualities: charisma. “Ideally they will not only project confidence, deference, interest, boredom or emotion when socially appropriate, but they’ll inspire us to want to have them around,” she wrote in Wired in April 2013.

“Children with autism are very attracted to robots,” said Dr. Kerstin Dautenhahn, professor of artificial intelligence in the School of Computer Science at the UK’s University of Hertfordshire. Since 1998, Dautenhahn has studied the relationships autistic children form with robots and has concluded that the right robot can provide “a safe, predictable, nonjudgmental and enjoyable environment” for this special group.

The elderly are another group likely to bond with robots. Dautenhahn’s research examines how robots provide physical, cognitive and social assistance for the elderly in their homes, with a design focus on how to personalize social robots for individual users.

Paro, a therapeutic robot seal, was used to comfort residents of temporary housing in Kesennuma, Japan, following the March 2011 earthquake-tsunami disaster. (Image © Kazuhiro Nogi/AFP – Getty Images)




The first trials are already occurring in Japan. “Japan has the fastest-aging population in the world,” said Atsuo Takanishi, professor of mechanical engineering and director of the Humanoid Robotics Institute at Waseda University. “In fact, Japan today may be the epicenter of personal robotics research and commercial development.”


Europeans like robots, too, according to a recent Eurobarometer poll, but many observers are raising questions about society’s readiness to integrate social robots. For example, critics recently have sounded alarms about the rising number of companies that want to market robots for child care.

“I am worried about a product marketed as a child-safe robot,” Dautenhahn said. “Many parents would really believe that (claim). We can’t pretend that robots can provide the emotional needs children must have.” 

Anniina Huttunen, a doctoral student at the Graduate School of Law in a Changing World and Institute of International Economic Law (KATTI) at the University of Helsinki, Finland, studies artificial intelligence and robotics. She said the rise of intelligent systems also raises questions of intellectual property rights, ownership and liability: if a robot causes injury, who is responsible – the owner or the robot? Current law doesn’t say.

Sherry Turkle, professor of the social studies of science and technology at MIT in Cambridge, Massachusetts (USA), has studied how people interact with machines for 30 years. In her latest book, Alone Together: Why We Expect More from Technology and Less from Each Other, Turkle describes an unsettling trend in which people are delegating more human relations to robots, phones and computers. We have arrived at this robotic movement, she told an audience of the American Association for the Advancement of Science, “not because we have built robots worthy of our company, but because we are ready for theirs.”


Meanwhile, space-companion Kirobo has captured the imaginations of Japanese children. By the time those children grow old, the concept of robots as companions likely will be commonplace, Dautenhahn said.

“The elderly of the future will have grown up with the iPhone, and their relation with technology will be different,” she said. “Acceptance will be more tolerant. The robot will be integrated rather than being just another gadget to operate.”

Robot designer Knight believes the integration has already begun. “We are already cyborgs, conjoined with our mobile phones, social-media identities, apps and automotive exoskeletons,” she writes. “We do not need robots as replacement lovers or friends – humans provide far more complexity. But robots make fantastic wingmen.” ◆

Cancer calculations

US teen’s computer algorithms improve cancer diagnosis accuracy

Lisa Rivard

2 min read

While some teenage girls spend their days obsessing over the latest movie release or boy band, Brittany Wenger pursues a very different passion – diagnosing, and perhaps curing, cancer with complex computer programs she created after teaching herself to write the code.

Using artificial neural networks – computer programs coded to “think” and “learn” like the human brain – 18-year-old Sarasota, Florida (USA) student and self-taught computer coder Brittany Wenger has written a program that diagnoses breast cancer and an aggressive form of childhood cancer called mixed-lineage leukemia (MLL) more accurately than ever before possible (see related article on Page 38).

Wenger’s research, which earned her the prestigious 2012 Google Science Fair Grand Prize, the Intel ISEF Grand Award and a trip to the White House to present her work to US President Barack Obama, has gone viral, prompting oncologists worldwide to contribute their samples to her database, which has raised its predictive accuracy to 99.11% – and climbing.

“The whole judging panel came away with a big ‘wow,’” said Vint Cerf, widely acknowledged as one of “the fathers of the Internet” and the vice president and chief Internet evangelist at Google.

Wenger’s work also has identified small subsets of proteins (20 proteins in the case of breast cancer and just four in MLL) from the tens of thousands involved in each disease that are the best predictors of which cells will prove cancerous – information that should help pharmaceutical companies home in on a short list of potential new drug targets.


“Because artificial neural networks aren’t told exactly how to do a specific task, they learn based on experiences and mistakes,” Wenger explained. “And since they have this learning quality, they can be ‘trained’ to recognize patterns that are far too complex for humans to detect. The implications of this pattern-recognition capacity are very exciting in every field of science.”
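Wenger’s description – a network that is never told the rule, only corrected when it errs – is visible even in the simplest possible case: a single artificial neuron trained with the classic perceptron rule. This toy sketch (invented data, nothing like the scale of her actual diagnostic networks) learns a threshold pattern purely from its mistakes:

```python
# A single artificial neuron ("perceptron") that learns a pattern
# from labeled examples. It is never given the rule; it only nudges
# its weights whenever one of its predictions turns out to be wrong.
# Toy illustration only -- real diagnostic networks are far larger.

def predict(weights, bias, features):
    """Fire (1) if the weighted sum of the inputs crosses zero."""
    total = bias + sum(w * x for w, x in zip(weights, features))
    return 1 if total > 0 else 0

def train(samples, epochs=20, lr=0.1):
    """Learn weights with the perceptron rule: update only on errors."""
    n = len(samples[0][0])
    weights, bias = [0.0] * n, 0.0
    for _ in range(epochs):
        for features, label in samples:
            error = label - predict(weights, bias, features)
            if error:  # a mistake -- this is where learning happens
                weights = [w + lr * error * x
                           for w, x in zip(weights, features)]
                bias += lr * error
    return weights, bias

# Invented "measurements": the label is 1 when the two values sum above 1
data = [([0.9, 0.8], 1), ([0.2, 0.1], 0), ([0.7, 0.6], 1), ([0.3, 0.2], 0)]
w, b = train(data)
print(all(predict(w, b, f) == y for f, y in data))  # True
```

More labeled examples sharpen the learned boundary, which is the same dynamic that makes Wenger’s network more accurate as hospitals contribute additional samples.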
The program becomes increasingly accurate as it “learns” from more and more data points, so Wenger was adamant about putting her network on the Internet-connected cloud. “The cloud is able to connect the world and researchers like never before,” Wenger said. “I’m able to work with a hospital in Italy, for instance, through a web service. They’re able to provide input into my program with different samples, conveniently, remotely and inexpensively. Having that power is huge.”

Wenger’s programs are being beta-tested in hospitals in the US and Italy, and she is further customizing the network for use with ovarian and lung cancer.


Perhaps the only thing more impressive than Wenger’s extensive list of secondary-school accomplishments is the promise of her future. In August 2013, she entered Duke University in Durham, North Carolina (USA), as one of eight Angier B. Duke Scholars in the class of 2017, with plans to pursue a career in pediatric oncology. The Duke financial awards, which pay the full cost of four years at the top-tier university, are among the world’s most competitive scholarships.

Although Wenger was always an inquisitive child with an interest in science, her passion for the crossroads of computing and medical research was inspired by a cousin’s battle with breast cancer and fueled by a collection of dedicated teachers and the unwavering support of her parents. “I was the kid who never outgrew the ‘why?’ phase,” she said. “A lot of parents probably would have put their foot down when I was setting my alarm clock every three to four hours to test some of the programs I’ve created, but my parents have always understood how important my research is to me.”

And now, thanks to that support, Wenger’s research is offering new hope to cancer patients worldwide and new support for researchers determined to defeat the disease. ◆

R&D for hire

Corporations increasingly buy government expertise

Rebecca Gibson

6 min read

When developing a new product, many corporations will innovate in-house. Some, however, have transformed their approach and are choosing to capitalize on the abundance of technology, resources and human expertise available in government research and development agencies.

When Australian metal-casting manufacturer AW Bell decided to enter the rapidly growing North American market, the company was faced with a challenge. To tap into the aerospace and defense segments, it needed to improve its traditional investment casting process (a technique for casting alloys using a wax mold), but lacked the technical expertise and equipment to do so. Rather than spend a considerable amount of money and time developing its own facilities and employing additional staff, AW Bell decided to leverage the skills of a research and development (R&D) agency funded by the Australian government.

“Recognizing the need for high-caliber expertise and equipment, AW Bell used the Australian government’s ‘Enterprise Connect Researchers in Business’ initiative to bring the Commonwealth Scientific and Industrial Research Organisation (CSIRO) on board,” said Damien Thomas, group director of business development in Manufacturing, Materials and Minerals at CSIRO, Australia’s national science agency. “Together we developed a new technique for metal processing.”

After partnering with CSIRO, AW Bell developed its aluminum billet equivalent (ABE) casting process, which uses rapid and controlled solidification to form metal parts. “Using ABE, we can produce aluminum castings with significantly higher mechanical properties than those generated with traditional investment casting processes,” said Andrew Meek, CEO of AW Bell, which was named the preferred supplier for two major US aerospace companies after the project. “This enables our customers to design lighter parts. Our manufacturing experience, combined with the expert knowledge of CSIRO’s researchers and the fact that we were able to retain all intellectual property produced during the research stages, made this project very successful.” 

According to Thomas, AW Bell is just one of 1,600 Australian companies and 375 multi-national corporations turning to CSIRO to reap the numerous benefits associated with supplementing or replacing their own in-house R&D investments.




“Government research agencies like CSIRO are well placed in the current era of ‘open innovation’,” Thomas said. “Companies are beginning to realize that they cannot necessarily generate all the good ideas by themselves in their own research laboratories and are increasingly working with organizations such as CSIRO, their customers and suppliers to source innovation, science and technology capability.”


CSIRO is one of many government agencies worldwide experiencing an upward trend in outsourced contracts from private businesses looking to use advanced R&D facilities to enhance their position in the marketplace.

“Companies recognize more clearly than ever that their ability to compete in the international marketplace will, largely, depend on their capacity to innovate and research new technologies,” said Roger Werne, deputy director of industrial partnerships at Lawrence Livermore National Laboratory (LLNL), a security research organization of the US government based in Livermore, California. “Many corporates do not have the fundamental scientific research capabilities to produce this innovation, so they are capitalizing on the technology available in national labs to help them develop more products and increase their competitive position in the market.”

Compact Particle Acceleration Corporation (CPAC) – also based in Livermore – recently developed a four-meter linear particle accelerator by adapting elements of LLNL’s existing nuclear weapons and laser technology. Working with LLNL’s research team, CPAC refined the technology for use in proton therapy, which is used to treat cancer patients. “Through our cooperative research and development agreement, we were able to use LLNL’s team and equipment for specific tasks that required highly specialized expertise,” said Anthony Zografos, chief operating officer of CPAC. “LLNL had invented, or co-invented, many of the core technologies. By simply walking across the street, we had immediate access to resources and facilities that were essential to our work.”


CSIRO, Australia’s national laboratory, has conducted R&D for 1,975 companies, including 1,600 from Australia and 375 multi-nationals.

The particle accelerator is expected to be deployed in various US hospitals starting in 2014. By enabling doctors to treat solid tumors directly with minimal impact on surrounding tissue, the treatment is expected to cause less tissue damage than current radiation treatments.

“Our partnership with LLNL gave us worldwide recognition and credibility, while allowing us to complete aspects of the development that would otherwise have been prohibitively expensive,” Zografos added. “It was a highly efficient and effective use of our resources and brought a potentially life-saving technology to reality. It would have been impossible for us to complete this project in any other way.”

CPAC’s collaboration with LLNL also provides benefits to members of the public, whose tax dollars support LLNL’s research. “The US government has invested a considerable amount of resources to develop this technology for defense-related applications,” Zografos said. ”As CPAC’s collaboration with LLNL shows, this technology can have practical applications in other fields. In addition to giving the taxpayer access to life-saving cancer therapy technology, we expect to create a significant number of high-paying jobs as a result of deploying this technology. We expect to continue collaborating with LLNL and develop this technology for many more years to improve the performance and reduce the size and cost of the accelerator.”


Choosing to outsource R&D to a government lab with existing facilities and a large funding budget, rather than a private organization with more limited financial resources, also decreases the need for companies to invest in their own laboratories. Many companies, such as US-based metal treatment services provider Metal Improvement Company (MIC), have used this approach to reduce the risk of incurring unpredictable costs.

“National labs like LLNL develop a broad range of readily available expertise and cutting-edge equipment over the years, which is a huge asset to a commercial organization,” said Lloyd Hackel, who worked on MIC’s project to leverage LLNL’s high-powered laser-peening technology to pre-stress metal. Hackel has since joined MIC subsidiary Curtiss Wright Surface Technologies as vice president of advanced technologies.



“In the case of LLNL’s solid-state laser, the US government had spent a large sum of money advancing laser technology for specific national-security applications, so the development had already been funded and was essentially ready for a commercial application,” Hackel said. “It would have been enormously expensive and taken many years for MIC to develop the same technology and expertise.”

The use of such innovative technology, which prevents individual metal components from cracking, allowed MIC to pass on many benefits to its customers. “Using this laser-peening solution, we were able to make the blades on the Boeing 777 and Airbus A340 last longer. The Boeing 747-8 – which can also fly at a higher altitude than before – is now the world’s most fuel-efficient passenger aircraft,” Hackel said.


Established more than 70 years ago in Finland, VTT Technical Research Centre (VTT) is a government-sponsored agency that uses virtual simulation technology – and the specialized experience of its staff members – to help companies improve production processes, reduce costs and remain competitive.

“Organizations do not just need to invest in the right hardware or software,” said Kaj Helin, team leader and principal scientist in the Human-Machine Interaction and Virtual Engineering department at VTT. “They must also invest in people who have the knowledge base to exploit technology and optimize research and development processes to deliver the best possible results – and that’s where national agencies like VTT come in.”

Taking advantage of the knowledge of agency staff members – often leading experts in their fields – helps companies to innovate and extend their own knowledge and technology base, without needing to reinvent existing technology or build and staff a new research lab.

“Capitalizing on CSIRO’s years of expertise and well-managed intellectual property enables companies to increase their speed-to-market by developing existing technology at a faster rate, rather than reinventing innovation, which can take years,” CSIRO’s Thomas said.

For example, when GE International entered into a five-year, AUS$20 million (US$18.9 million) alliance with the Australian science lab in 2010, it was able to leverage CSIRO’s patented intellectual property and develop advanced technology for the healthcare, aviation and energy sectors.

“GE aims to develop solutions to some of the world’s biggest health, environmental and energy challenges, and the only way we could solve these challenges for both Australia and the rest of the world was to capture the knowledge and experience of CSIRO,” explained Steve Sargent, CEO of GE Australia and New Zealand. “Our ‘healthymagination’ and ‘ecomagination’ initiatives are driving real change globally. This investment highlights the role Australian science can play in solving the world’s toughest challenges by delivering affordable healthcare to all and mitigating the effects of climate change with cleaner technology.”

For many corporations, the benefits of outsourcing speak for themselves. As Thomas concludes: “CSIRO and others have many years of expertise and know-how. By working with government funded institutions, enterprises can secure knowledge, form partnerships, gain a competitive advantage within their industry, bring innovative products to the market quickly, and tap into global supply chains. With all of these benefits, outsourcing R&D makes perfect sense.” ◆

Reading, writing and algorithms

Compulsory teaching of computer science gains momentum

Cathy Salibian

5 min read

In an increasingly digital world, computer science is the key to global competitiveness and well-paying jobs. Yet experts agree that too few school systems around the world are preparing primary and secondary-school students adequately for careers in the field. A growing number of educators, legislators and advocacy groups, however, are working to change that pattern.

Semester after semester, Emanuel S. Grant, associate professor in the Computer Science Department at the University of North Dakota in Grand Forks (USA), takes part in an open house for high school students interested in learning about the university’s various programs.

Sitting at the computer science table, he watches some students go out of their way to avoid the table altogether; computer science retains that “geek” stigma. Other times, a male and a female student will walk in together; the male may approach the table, but the female pointedly veers away.



The hurdles continue in the second semester of college, when students enrolled in computer science begin to fall away. Even students who believe they are interested in the field typically have been exposed only to office and gaming applications, Grant said. They don’t understand that computer science includes sophisticated mathematical algorithms, data structures and computational thinking – creating applications, not just using them. When they find themselves in alien territory, too many of the students give up.

“It’s sad to see the falloff in enrollment as students realize that what they thought is computer science is not,” Grant said. “Too few schools at the K-12 level (pre-college) in this area of the state (North Dakota) teach anything truly related to the subject. Students come to us unprepared.”


Grant is among a growing number of educators, lawmakers, business leaders and concerned citizens worldwide who are working to change that pattern through legislation, curriculum development and a fresh look at educational standards.

In a world brimming with digital technologies, the study of computer science delivers both educational and economic benefits, experts say, giving students problem-solving skills and entry to high-paying jobs. According to the US Bureau of Labor Statistics, the median annual US salary in 2011 was US$45,230, while the average annual salary in computer and mathematical occupations was US$78,730. In a nation struggling to reduce unemployment, 150,000 computing-related jobs open annually.

Yet the number of US schools offering computer science courses is declining – from 78% in 2005 to 69% in 2011, according to the advocacy organization Computing in the Core (CinC), based in Washington, D.C. (USA). Only 14 states and the District of Columbia treat secondary-school computer science courses as core mathematics or science credits.

“Computer science is an essential skill; it’s fundamental knowledge for the 21st century and where the job growth is,” CinC founder Cameron Wilson said. “Yet many school districts still put it in the same ‘elective’ bucket as nonprofessional courses.”


Mandatory teaching of computer science education varies widely from country to country. China, the world’s most populous nation with close to 1.4 billion people, includes computer science as part of its curriculum, but wide disparities persist between rural and urban schools in how extensively the subject is taught. What’s more, some educators believe the way computer science is taught in China – emphasizing hierarchy, rote learning and copying past masters – stifles the very creativity essential to future competitiveness.

“China has shown the world it can manufacture anything; that’s what has driven its economy for 20 years,” said Dr. James A. Landay, who recently joined Cornell University’s NYC Tech campus in Manhattan, New York (USA) after 2.5 years of teaching in China. “But to move its huge population into a middle-class lifestyle, China can’t just manufacture products; it has to come up with innovative concepts and designs and market products as well as manufacture them. That’s where the money is, and in these areas China lags behind.”

In China, Landay worked as a visiting researcher at Microsoft Research Asia and taught computer science at Tsinghua University in Beijing. Computer Science & Technology attracts more students than any other undergraduate major in China, he said, but the curriculum is traditional, covering operating systems, networking and databases. It lacks cutting-edge courses in critical skills necessary for designing innovative software products, including computer-supported collaboration and user interfaces. The same is true in India, he observed. “China and India are like the US 25 years ago,” Landay said.

India is actively working to tackle the challenge, however. According to the 2011 report “Computing at School: International Comparisons” prepared by Computing at School (CaS) in collaboration with Microsoft and global researchers, India allowed each school to decide how to teach the subject through eighth grade. Starting in ninth grade, at age 14, students could take Computer Applications (the use of computing tools) and Computer Science (programming and algorithms). In June 2013, the Department of Computer Science and Engineering at the Indian Institute of Technology in Mumbai published a new, modernized and standardized computer science curriculum to be taught in Indian schools.

The CaS report outlines significant steps taken to promote computer science education in other areas of the world. In Germany, for example, computer science is not mandatory and cannot substitute for other science subjects, but the credits earned fully count toward graduation requirements. Scotland’s Curriculum for Excellence, which aims to shift teaching focus from facts to skills and competencies, includes computer science in the curriculum. New Zealand revamped its school curriculum in Digital Technologies to include an explicit strand entitled “Programming and computer science.” And in 2012, British Education Secretary Michael Gove moved to replace a “demotivating and dull” information and communications technology curriculum with a flexible one in computer science and programming.

“Instead of children bored out of their minds being taught how to use Word or Excel by bored teachers, we could have 11-year-olds able to write simple 2D computer animations,” BBC News quoted Gove saying.




Israel is widely regarded as the world leader in computer science education. Not surprisingly, Israel also has the world’s highest level of per-capita venture capital funding and the highest density of technology startups. In 1998, Israel’s Ministry of Higher Education implemented a secondary school computer science curriculum in which Israeli students can choose from tracks for casual or serious interest in computer science. Teachers of the subject receive specialized education and certification. Advocates worldwide consider Israel’s program a model for computer science education. However, its successes may prove difficult to emulate in other countries. With fewer than 8 million people, Israel has a centralized education system. The US, by contrast, has a population of 316 million; its educational system is governed by a complex mix of federal, state and local authorities.

“There’s no single lever to pull to change the US education system,” said CinC’s Wilson, and the same situation exists to varying degrees in many other parts of the world. Wilson’s top recommendation to authorities at any level is to “clearly define and include K-12 (pre-college) computer science education as a critical part of education initiatives.”


After years of watching students avoid computer science or choose it only to drop out, Grant of the University of North Dakota is working with instructors worldwide to develop a coordinated global approach to computer science education. At the 2013 International Conference on Computer Science Education in Thailand, Grant conducted a workshop on developing a collaborative paradigm for software engineering curriculums worldwide.

Business is global, Grant said, but curriculum standards are not, making it difficult for companies to know exactly what skills to expect from the graduates they hire. “When you look at what to teach at the college level, you have to consider what kind of preparation those students bring from their primary and secondary schools,” Grant said. “Students need to understand that computer science isn’t just for geeks. It’s cool.”

Frequently, the discussion of global computer science education focuses on national competitiveness — as if one nation’s win is another’s loss. Landay and Grant take a different point of view. “Everyone creating the best products they can is good for all of us,” Landay said. “In the long run, people tend to specialize in what they do best. A rising tide lifts all boats.”

Rigorous measurement

Raytheon CEO achieves success through discipline

Tony Velocci

5 min read

Raytheon Company Chairman and CEO Bill Swanson, who is stepping down as CEO at the end of March 2014, has developed a few simple rules during his 41 years in the aerospace and defense industry: Apply discipline to everything, model performance to ensure outcomes and – if you really want to keep your shareholders happy – always give customers more than they expect.

While viewing a news report on Operation Desert Storm in 1991, Raytheon Company Chairman and CEO Bill Swanson grew increasingly concerned as two American soldiers struggled to operate a shoulder-mounted weapon that appeared to be malfunctioning.

When Swanson recognized the equipment was Raytheon’s, he said, his heart skipped a beat. Although the equipment itself performed as predicted, he sought answers and discovered a potential gap in the training operators received in service. Swanson immediately reached out to the government customer to offer support and ensure training consistency so that no soldier would ever again be caught in such a predicament.

“The product had our name on it, and what I witnessed was absolutely unacceptable,” Swanson said. “Whatever equipment or service we deliver, there must be no doubt it will work.”


Swanson, who is stepping down in March 2014 after more than a decade as CEO but will continue to serve as chairman of the Raytheon Board of Directors, applies the same rigorous process control to every functional area – not just manufacturing, but also financial systems, contracts administration, even human resources.

Early in his 41-year career, as Raytheon’s youngest plant manager, he observed the critical difference that common processes make in achieving operational excellence.

“Variation is costly and will work against you,” he said. “I wanted our products to be either 100% right or 100% wrong. Anything in the middle meant there would always be an element of doubt about whether the product should be released.” That all-or-nothing philosophy has become his hallmark.


A decade ago, Raytheon emerged as a survivor of the massive consolidation of US defense contractors triggered by the former Soviet Union’s collapse.

The mergers and acquisitions mashed together disparate management styles and processes. Integrating myriad business cultures into a single high-performing company took years.

In 2003, following his appointment as Raytheon’s chief executive, Swanson focused on creating a single culture bound by common processes. Implementing common systems took two years, but led to escalating levels of operating and financial performance.

Achieving process discipline across an organization is never easy. “Given a choice, every division president, plant manager, and product-line leader will want to do things their own way,” Swanson said. “Process discipline is the most underappreciated part of doing business, and that applies to all businesses, but the rewards are worth it.”

Craig Oxman, a managing director and global head of the Aerospace/Defense Group of Credit Suisse, applauds Swanson for transforming Raytheon from a company defined by variability to a cohesive and focused enterprise. “He has tremendous domain knowledge and a keen awareness of the direction in which his customers are going,” Oxman said. “These attributes are a distinct competitive advantage.”


Another lesson Swanson learned is that satisfying customers is the best way to deliver shareholder value. “Companies can sometimes pay too much attention to investors,” he said. “If your customers abandon you, you’re out of business.”

Swanson vowed that customer satisfaction would be the centerpiece of his business strategy, reasoning that if Raytheon delivered a superior customer experience, shareholder returns would follow. By Swanson’s standard, on-time delivery isn’t good enough. “Around here, our performance better exceed customer expectations,” he said.

Byron Callan, director of Capital Alpha Partners, a Washington, D.C.-based policy research firm for institutional investors, said Swanson brought an engineer’s mindset to the job of leading Raytheon, characterized by a strong focus on keeping shareholders happy while delivering operating performance. “It’s a tough balancing act that few CEOs have accomplished as well as Bill Swanson,” Callan said.

Just as top-performing companies have certain qualities in common, Swanson believes underperformers share common weaknesses. “There is a tendency to look for a ‘silver bullet,’ when the fundamentals are what is required and proven over time,” he said. “Rule Number One is ‘Listen to your customers,’ and Rule Number Two is, ‘Help make your customers successful.’”

In keeping with that philosophy, Swanson and his management team have virtually eliminated troubled programs and established an enviable reputation for reliability. Customer satisfaction has soared, and investor returns have too. According to business, financial and economic tracking company Bloomberg, Raytheon’s investor returns increased 195% between Dec. 31, 2002 and Sept. 30, 2013, versus 121.6% for the Standard & Poor’s 500.



“Raytheon’s margins are to-die-for in an industry that is very constrained by government rules and regulations,” said Tom Captain, vice chairman and US Aerospace & Defense Leader for management consulting firm Deloitte. “He is the gold standard for a CEO in today’s complex world.”

One of Swanson’s strengths, Captain said, is that he doesn’t chase management fads. “If you stick to your knitting and follow basic business and management principles you win, and that is exactly what Swanson has instilled across Raytheon,” Captain said.


Across the aerospace industry, Raytheon is commonly referred to as an engineering company, but Swanson wants Raytheon to be recognized for engineering and innovation, which requires corporatewide improvements by the entire Raytheon team working collaboratively. For example, it used to take the company 13 days to close its books; now it takes just two. “That’s the result of innovative thinking,” Swanson said.

Innovation can take many forms. Before becoming CEO, Swanson recognized that the priorities of Raytheon’s customer base – 90% of which are governments – increasingly focused on predictable outcomes and more affordable products.



Defense budgets run in boom-or-bust cycles. For companies like Raytheon, this time-honored pattern requires successful contractors to anticipate the macro forces that will shape their future.

Swanson responded as only an engineer’s engineer could, adopting modeling and simulation to more accurately predict outcomes. The result was Raytheon’s Model-Based Enterprise.

The innovation allows Raytheon’s customers to accurately visualize a product concept before it is engineered. Modeling and simulation require extensive information-sharing across an organization, but reduce business risks for the company and its customer. From that perspective, modeling and simulation may be the purest form of optimizing the “customer experience,” because Raytheon’s Model-Based Enterprise helps government customers visualize evolving requirements. “It is a great tool for tackling complex problems of all kinds and helping your customer see what the future will look like,” Swanson said.


Swanson was so confident in the versatility of modeling and simulation that he conceived of applying it to another of his passions: science, technology, engineering and mathematics (STEM) education, to help education stakeholders make more informed decisions.

His inspiration for modeling education came from a meeting with the Business Higher Education Forum, the oldest US organization of senior business and higher education executives, which is dedicated to advancing solutions to US education and workforce challenges. The group had convened to discuss the alarming rate of decline in US student math and science test scores and to identify solutions. Swanson felt the experts in the room all had great suggestions, but when he asked them to identify the three that Raytheon should focus on for maximum impact, they couldn’t agree – an experience Swanson found both frustrating and inspirational.

Following the meeting, he marshaled Raytheon resources to create what became the U.S. STEM Education Model, an open-source tool for making STEM policy decisions and guiding practitioners. Top universities, including Stanford and the Massachusetts Institute of Technology, have since validated the model. The effort is part of Raytheon’s flagship STEM education program, which connects with students from elementary school through college, supports educators and policymakers, and promotes racial and gender equality to make a meaningful, long-term impact.

For Swanson, these efforts are a personal commitment. He was the first member of his family to go to college, which he considers a humbling responsibility. “I am first a product of my family and second a product of my schooling,” he says, “and so I have a great passion to give back.” 


Raytheon Company is a US$24-billion company specializing in defense, security and civil markets worldwide. It operates in 80 countries and employs 60,000 people.

The business began in 1922 in Cambridge, Massachusetts (USA), as the American Appliance Company. It was founded by Vannevar Bush, who became dean of the Massachusetts Institute of Technology (MIT) School of Engineering; Laurence Marshall, an engineer; and Charles G. Smith, a scientist. They invented the S gas rectifier, which transformed the radio into an affordable home appliance.

In the decades that followed, Raytheon built on its reputation for innovation. It stands today as a global technology leader.

Intelligent machines

The Internet of Things revolutionizes industrial-equipment services

Nick Lerner

3 min read

Steam, electricity and electronics powered the first three industrial revolutions. Now the Internet is driving Industry 4.0, which is revolutionizing industrial equipment and the services that support it with connected, intelligent machine capabilities that can yield profit margins of up to 30% for machine builders.

By 2020, 30 billion devices – from jetliners to packaging machines – will be wirelessly connected to the Internet, according to ABI Research, a market research and market intelligence firm based in New York City (USA). The trend has significant implications for the industrial-equipment industry; by building intelligence and connectivity into their machinery, original equipment manufacturers (OEMs) can innovate the services they offer, delivering superior customer experiences while generating new, high-profit revenues.

When a piece of industrial equipment is connected to the Internet, it can transmit information back to its builder on everything from wear to production patterns. Such information allows manufacturers to offer proactive services that help their customers avoid downtime and optimize production.

“In many cases, services are of greater importance than the product itself,” said Dieter Spath, who led Germany’s Fraunhofer Institute for Industrial Engineering (IAO) – which unites research, science and business – before leaving to become CEO of a high-tech company. “Industry 4.0 is gaining momentum across the industrialized world as a new way to deploy the power of the Internet with cyber-physical products in the service of manufacturing.”

By harnessing this capability, industrial equipment manufacturers can supply more services to optimize their customers’ machines. “To remain competitive, companies are putting service at the front of their offering,” Spath said. “It is conceivable that products and equipment may even be given away or subsidized to create demand for more profitable services, so OEMs need to be innovative and agile in their service provision.”


Equipment that continually monitors its condition and reports that information back to its manufacturer also allows OEMs to learn more about how their customers use the machines.

“They can use those insights to provide their users with better machines and services,” Spath said. “For example, it will be possible to remotely oversee, control and schedule thousands of machines across the world from a single location. Machines will self-diagnose and auto-order parts within a connected system. New services will emerge that monitor, regulate and modernize machines and systems, releasing their latent value.”
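The self-diagnosis loop Spath describes can be sketched in a few lines. The example below is purely illustrative – the machine names, part names and wear threshold are all hypothetical, not drawn from any real connected-machine system:

```python
# Hypothetical sketch of a connected machine's self-diagnosis loop:
# each machine reports per-part wear readings, and parts past a wear
# threshold are flagged for an auto-generated reorder.

WEAR_LIMIT = 0.8  # fraction of rated service life consumed

def self_diagnose(telemetry):
    """telemetry: {machine_id: {part_name: wear_fraction}}.
    Returns a reorder list of (machine_id, part_name) pairs."""
    reorders = []
    for machine_id, parts in telemetry.items():
        for part, wear in parts.items():
            if wear >= WEAR_LIMIT:
                reorders.append((machine_id, part))
    return reorders

# Example: two machines reporting in from the field.
fleet = {
    "press-014": {"bearing": 0.91, "belt": 0.35},
    "press-022": {"bearing": 0.40, "belt": 0.82},
}
print(self_diagnose(fleet))  # parts due for replacement
```

In a real deployment the reorder list would feed a procurement system rather than a print statement, and the threshold would vary by part and duty cycle; the point is only that the monitoring-to-ordering chain is a simple data flow once machines are connected.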


Willy Shih, professor of Management Practice at Harvard Business School in Boston, Massachusetts (USA), adds that “accumulating large amounts of data from the shop floor is becoming easy and inexpensive. By applying analysis methods, data can be used to do a better job of optimizing production, fault-finding and cutting waste. New services will develop around releasing the value of that data to show machine owners how to improve equipment utilization and increase up-time and system performance to achieve higher levels of productivity.”

Manipulating bigger data sets allows producers to understand the cause and effect of productivity metrics in real time, Shih said. “We have never been able to see so much before. For example, in complex multi-step production, patterns of yield variation can be seen along with their causes. This helps isolate problems and allows companies to take immediate action to solve them. Because customers will pay to reduce downtime and improve throughput, opportunities exist for companies to provide services that collect, interpret and act on this data.”
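The yield-variation analysis Shih describes amounts to comparing the spread of yields at each production step. A minimal sketch, with invented step names and batch figures used only for illustration:

```python
# Hypothetical sketch: locating the most erratic step in a multi-step
# production line from per-batch yield data.
from statistics import pstdev

def noisiest_step(yields):
    """yields: {step_name: [batch yield fractions]}.
    Returns (step, stdev) for the step with the widest yield variation."""
    spread = {step: pstdev(vals) for step, vals in yields.items()}
    worst = max(spread, key=spread.get)
    return worst, spread[worst]

batches = {
    "stamping": [0.98, 0.97, 0.98, 0.99],
    "coating":  [0.95, 0.80, 0.99, 0.70],  # erratic step to investigate
    "assembly": [0.96, 0.95, 0.97, 0.96],
}
step, sd = noisiest_step(batches)
print(step, round(sd, 3))
```

Real shop-floor analytics would correlate the variation with process parameters to find its cause, but even this crude ranking shows how pooled data isolates where attention should go first.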


Such advances in technology are “developing into fertile ground for machine makers to offer profitable new data-based services that make their customers more money,” said John Blyler, affiliate professor of Systems Engineering at Portland State University in Oregon (USA), and vice president and chief content officer at Extension Media, a high-tech publisher.

Blyler points to energy usage in production plants as a case where a connected system could make immediate and widespread improvements. “Companies that help machine users take advantage of cheaper energy could assist them in making less expensive products,” he said. “This means looking at production across a global, system-connected factory and spontaneously changing manufacturing locations and schedules, production layouts, machining cells and supply lines to save energy depending on the energy prices.”
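The energy-driven routing Blyler envisions reduces, in its simplest form, to picking the plant with the lowest energy cost for a given run. The sketch below uses made-up plant names and tariffs purely to illustrate the decision:

```python
# Hypothetical sketch: route a production job to whichever connected
# plant offers the lowest energy cost for the run.

def cheapest_plant(plants, kwh_needed):
    """plants: {name: price_per_kWh}. Returns (plant, total_energy_cost)."""
    name = min(plants, key=plants.get)
    return name, plants[name] * kwh_needed

tariffs = {"plant-a": 0.11, "plant-b": 0.09, "plant-c": 0.14}  # USD/kWh
plant, cost = cheapest_plant(tariffs, kwh_needed=12_000)
print(plant, cost)
```

An actual system would weigh logistics, capacity and schedule alongside the tariff, and would re-run the decision as spot energy prices move – which is exactly the “spontaneous” rescheduling the connected factory makes possible.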


In a recent research report titled “Embracing Digital Technology,” the Massachusetts Institute of Technology (MIT) Sloan School of Management and Capgemini Consulting found that only 25% of the global business executives surveyed recognize the potential for digital transformation to launch new products and services, so those who move now will have an advantage.



The report cited General Electric (GE) as a company that has recognized the potential. GE is actively promoting a service strategy “that will help it tell customers how to schedule maintenance and avoid part failures, improving operations. The company expects it will sell services related to maintaining its products.” GE’s plan is just one example, the report found, that “the opportunity for digital technologies to create new businesses is real.”

Industrial-equipment manufacturers that fail to step up to the challenge, however, can expect even more difficult times ahead. Blyler predicts: “Ignoring this trend could prove disastrous for machine makers as valuable profit centers are eroded by more efficient rivals and newcomers with innovative service models keen to own this increasingly dynamic sector.”
