February 08, 2016

Will Artificial Intelligence have an Uber Effect on Finance?

Yes, but "On the other hand, the front office is an area where Sutton believed that human interaction is necessary as finance is an industry that is very relationship centric as people leverage financial advisors and wealth managers to provide customised advice."

This from:

Will Artificial Intelligence have an Uber Effect on Finance?

Following recent breakthroughs in artificial intelligence (AI), many have been wondering how this form of technology can be implemented in financial services. As newer products emerge, the question is how relevant traditional legacy financial institutions will remain, or whether fintech startups will win a growing number of customers, much as Uber disrupted the taxi industry. bobsguide spoke to Josh Sutton, global head of the artificial intelligence practice at technology company Sapient, about how AI is set to transform business and finance, and how banks are already implementing a technology that has been around for more than 30 years but whose true potential hasn't been seen until now.

According to CNBC, nearly $700 million has been invested in artificial intelligence over the past two years. Sutton explained that it is important to work with the C-suite of a company to give them a roadmap of AI's capabilities, as it provides a way to increase revenue, reduce cost and minimise risk. "Increased investment in AI has been over 30 years coming and technology has caught up to the conceptual promise of what could be done. If you look at all the products deployed by machine learning today, these are not new concepts by any means, but the processing power of the machines has finally reached a point where it is cost effective and time effective enough to generate real results from that information," Sutton highlighted.

Sutton went on to explain that AI has long been used extensively by government and academic institutions, but banks have now started to use it to monitor their risk related to illegal insider trading. One large global bank has already implemented AI in place of the historical approach of having a team review trade information and police it manually, Sutton revealed. "The platform that they built combined big data, machine learning and causal intelligence and that aggregates all the trade data and communication data from various traders and people they interact with across the various divisions."

Alongside this, artificial intelligence will benefit different parts of an organisation in different ways. Sutton said that leveraging AI would "systematically accelerate certain portions of the core middle and back offices to automate everything from trade processing through to KYC and AML." This ties into the long-standing debate of the past year about whether human workers will still be needed as technology becomes increasingly sophisticated. At the moment, some tasks need to be completed by people and the rest by machine learning, but Sutton expects automation to sharply reduce the number of people required to fulfil the functions of the middle and back office.

"I think there will always be a need for people to identify and review the high priority activities but I do think that a substantial amount of work that is done today that is relatively trainable can be replaced via technology over the coming decade," Sutton said. On the other hand, the front office is an area where Sutton believed that human interaction is necessary as finance is an industry that is very relationship centric as people leverage financial advisors and wealth managers to provide customised advice. "I do believe that artificial intelligence will enable financial advisors to be much more effective in their interactions so, if you look at the job of a financial advisor, a significant portion of their time goes to understanding their individual customers, what is going on in their lives and what advice they can provide."

"What you'll see in the traditional wealth group, financial advisors will be able to take on a greater number of clients and the entire industry will expand as it becomes a cost effective tool that people can have that they haven't traditionally. If you look at a good disruptive example, like Uber, the model has changed the way that the industry works and it has dramatically increased the amount of money that gets spent." Sutton predicts that this "Uber Effect" will occur with artificial intelligence and the financial industry, especially in the retail banking industry where there will be a blur between retail banks and wealth managers.

"I think what you're starting to see is a lot of fintech players trying to nibble around the edges of that," Sutton highlighted, going on to say that artificial intelligence will become so ubiquitous in our day-to-day lives that you will not even be aware you are using it. However, to get to this point, many obstacles must be overcome, one of which is that the financial services industry focuses on big data when implementing AI, rather than seeing it as a business tool.

"Artificial intelligence is not a technology solution, it is a business solution."

January 27, 2016

The Low-Down: #WhatsApp Nears a Billion Users: Is It Finally Time to Make Money?

Given the number of competing services, adding users when you're free is not exactly easy, but it's a different challenge when you actually have to make money.

WhatsApp's global reach gives it some advantages. Whether it can figure out how to capitalize on them is another question. That said, being owned by Facebook, a company that also endured skepticism when it made that transition, suggests that it will get lots of experienced help. JL

Cade Metz reports in Wired:

WhatsApp has a greater global reach than nearly any other app. A lot of companies are global. And these companies may be willing to embrace this kind of messaging because WhatsApp gives them more efficient access to more people than any other medium. The company can refine this kind of communication—and eventually charge a fee for it.
Read the rest of the article online here: The Low-Down: WhatsApp Nears a Billion Users: Is It Finally Time to Make Money?

January 24, 2016

#Fintech investments will continue

Deconstructing the recent financial technology boom raises intriguing questions about the sustainability of the innovation cycle.

Fintech and the Fear of Froth

For more than half a decade, a seemingly irresistible momentum has been building around the idea that finance and technology are converging at a historic inflection point, a combination of business transformation and competitive disruption that has come to be labeled fintech. With annual investments in product development and entrepreneurial ventures now well into double-digit billions of dollars — and climbing — a 2000-style crash apparently isn’t in the offing.
However, even if fintech is booming, it is not immune to cyclical decline. Any investment, whether in a strategic initiative or a start-up, carries risk. But might there be more secular, or macro, forces that could wreak havoc on fintech as a whole? Or, to spin it more positively, what will it take to ensure this emerging sector’s longer-term sustainability?
There may be no better barometer of the fintech climate than what the money people are saying. The venture capitalists, investment bankers and others spotlighted last November in Institutional Investor’s inaugural Fintech Finance 35 ranking were unanimously enthusiastic and optimistic — but with murmurs of concern about too much froth. 
Alan Freudenstein, co-head of the Credit Suisse NEXT Fund, observed that some deals were “pushing prices to ridiculous levels.” Hans Morris, managing partner of Nyca Partners, cited overvaluations as a reason to be cautious about the blockchain boomlet.
Reacting to the mid-November announcement that Japanese e-commerce giant Rakuten had become the umpteenth corporation to launch a fintech investment strategy, Michael Maxworthy, a partner at M&A boutique Marlin & Associates, mused, “When will the madness end?”
In another everybody’s-doing-it example, Chicago is the latest major city to aspire to be a “fintech hub,” one of the local business community’s ChicagoNext initiatives.
Everyone wants a piece of fintech, but what exactly is it? The challenges and risks may lie in the fact that the concept is vague and undifferentiated. Some observers are taking a step back to define — or redefine — fintech. It is not one thing.
The basic premise is dialectic — a collision of old and new, incumbents and upstarts, legacy and disruption — along with a growing consensus that the opposites can profitably coexist.
Banks and other diversified financial institutions still have advantages in terms of customer databases and operational scale, but their lack of “fintech DNA” and “gaps in execution” leave them vulnerable to newer, agile, less regulated companies that are “digital first and can do one thing, but smarter,” says Senthil Kumar, vice president of marketing at Oracle Financial Services. 
That essentially describes how one of the early fintech disruptions, peer-to-peer and online lending, has played out over eight to ten years. Now Lending Club and OnDeck Capital, among other maturing newcomers, are forming partnerships with the likes of Citi and JPMorgan Chase & Co.
“The greatest opportunity lies at the meeting point of large financial institutions and young, ambitious start-ups,” Andrew Veitch, director of Anthemis Group, said last June upon the release of a “Fintech 2.0” paper that the London-based investment firm co-authored with Oliver Wyman Group and Banco Santander’s Santander InnoVentures.
Alexei Miller, a managing partner at technology development and consulting firm DataArt, deconstructs fintech three ways: general advancements like high-performance computing and mobility that make an impact on finance; entrepreneurs in Silicon Valley and elsewhere who set out to disrupt specific aspects of the business; and innovation led by established players. He says the last category is often overlooked but is yielding benefits increasingly through collaboration, as in Depository Trust & Clearing Corp.’s customer-data-aggregation affiliate, Clarient, and Goldman Sachs Group’s trading technology spin-off, REDI Holdings. 
John Dwyer, senior analyst at Oliver Wyman-affiliated research firm Celent, says attitudes toward “generic fintech” have been shaped by high-profile successes in online lending and electronic payments. He is considering a more detailed taxonomy of subsectors, including technology of regulation and compliance, which he dubs regtech; crypto tech, encompassing alternative currencies and blockchain; cybersecurity; insurance tech; and capital markets.
Seen in this light, fintech has legs, each of its facets evolving at a different pace. The capital markets category includes “the biggest [markets] on the planet,” Dwyer says: “As a fintech market, it is still in need of much greater definition.”

Fintech and the Fear of Froth

-- The MasterFeeds

December 19, 2015

The science myths that will not die

Once a myth has been created, it's hard to kill it off.

The science myths that will not die

Illustration by Ryan Snook
In 1997, physicians in southwest Korea began to offer ultrasound screening for early detection of thyroid cancer. News of the programme spread, and soon physicians around the region began to offer the service. Eventually it went nationwide, piggybacking on a government initiative to screen for other cancers. Hundreds of thousands took the test for just US$30–50.


Across the country, detection of thyroid cancer soared, from 5 cases per 100,000 people in 1999 to 70 per 100,000 in 2011. Two-thirds of those diagnosed had their thyroid glands removed and were placed on lifelong drug regimens, both of which carry risks.
Such a costly and extensive public-health programme might be expected to save lives. But this one did not. Thyroid cancer is now the most common type of cancer diagnosed in South Korea, but the number of people who die from it has remained exactly the same — about 1 per 100,000. Even when some physicians in Korea realized this, and suggested that thyroid screening be stopped in 2014, the Korean Thyroid Association, a professional society of endocrinologists and thyroid surgeons, argued that screening and treatment were basic human rights.
In Korea, as elsewhere, the idea that the early detection of any cancer saves lives had become an unshakeable belief.
This blind faith in cancer screening is an example of how ideas about human biology and behaviour can persist among people — including scientists — even though the scientific evidence shows the concepts to be false. “Scientists think they're too objective to believe in something as folklore-ish as a myth,” says Nicholas Spitzer, director of the Kavli Institute for Brain and Mind at the University of California, San Diego. Yet they do.
These myths often blossom from a seed of a fact — early detection does save lives for some cancers — and thrive on human desires or anxieties, such as a fear of death. But they can do harm by, for instance, driving people to pursue unnecessary treatment or spend money on unproven products. They can also derail or forestall promising research by distracting scientists or monopolizing funding. And dispelling them is tricky.
Scientists should work to discredit myths, but they also have a responsibility to try to prevent new ones from arising, says Paul Howard-Jones, who studies neuroscience and education at the University of Bristol, UK. “We need to look deeper to understand how they come about in the first place and why they're so prevalent and persistent.”
Some dangerous myths get plenty of air time: vaccines cause autism, HIV doesn't cause AIDS. But many others swirl about, too, harming people, sucking up money, muddying the scientific enterprise — or simply getting on scientists' nerves. Here, Nature looks at the origins and repercussions of five myths that refuse to die.

Myth 1: Screening saves lives for all types of cancer

Regular screening might be beneficial for some groups at risk of certain cancers, such as lung, cervical and colon, but this isn't the case for all tests. Still, some patients and clinicians defend the ineffective ones fiercely.
The belief that early detection saves lives originated in the early twentieth century, when doctors realized that they got the best outcomes when tumours were identified and treated just after the onset of symptoms. The next logical leap was to assume that the earlier a tumour was found, the better the chance of survival. “We've all been taught, since we were at our mother's knee, the way to deal with cancer is to find it early and cut it out,” says Otis Brawley, chief medical officer for the American Cancer Society.
But evidence from large randomized trials for cancers such as thyroid, prostate and breast has shown that early screening is not the lifesaver it is often advertised as. For example, a Cochrane review of five randomized controlled clinical trials totalling 341,342 participants found that screening did not significantly decrease deaths due to prostate cancer.
“People seem to imagine the mere fact that you found a cancer so-called early must be a benefit. But that isn't so at all,” says Anthony Miller at the University of Toronto in Canada. Miller headed the Canadian National Breast Screening Study, a 25-year study of 89,835 women aged 40–59 years old that found that annual mammograms did not reduce mortality from breast cancer. That's because some tumours will lead to death irrespective of when they are detected and treated. Meanwhile, aggressive early screening has a slew of negative health effects. Many cancers grow slowly and will do no harm if left alone, so people end up having unnecessary thyroidectomies, mastectomies and prostatectomies. So on a population level, the benefits (lives saved) do not outweigh the risks (lives lost or interrupted by unnecessary treatment).
Still, individuals who have had a cancer detected and then removed are likely to feel that their lives were saved, and these personal experiences help to keep the misconception alive. And oncologists routinely debate which age groups and risk factors would benefit from regular screening.
Focusing so much attention on the current screening tests comes at a cost for cancer research, says Brawley. “In breast cancer, we've spent so much time arguing about age 40 versus age 50 and not about the fact that we need a better test,” such as one that could detect fast-growing rather than slow-growing tumours. And existing diagnostics should be rigorously tested to prove that they actually save lives, says epidemiologist John Ioannidis of the Stanford Prevention Research Center in California, who this year reported that very few screening tests for 19 major diseases actually reduced mortality.
Changing behaviours will be tough. Gilbert Welch at the Dartmouth Institute for Health Policy and Clinical Practice in Lebanon, New Hampshire, says that individuals would rather be told to get a quick test every few years than be told to eat well and exercise to prevent cancer. “Screening has become an easy way for both doctor and patient to think they are doing something good for their health, but their risk of cancer hasn't changed at all.”

Myth 2: Antioxidants are good and free radicals are bad

In December 1945, chemist Denham Harman's wife suggested that he read an article in Ladies' Home Journal entitled 'Tomorrow You May Be Younger'. It sparked his interest in ageing, and years later, as a research associate at the University of California, Berkeley, Harman had a thought “out of the blue”, as he later recalled. Ageing, he proposed, is caused by free radicals, reactive molecules that build up in the body as by-products of metabolism and lead to cellular damage.
Scientists rallied around the free-radical theory of ageing, including the corollary that antioxidants, molecules that neutralize free radicals, are good for human health. By the 1990s, many people were taking antioxidant supplements, such as vitamin C and β-carotene. It is “one of the few scientific theories to have reached the public: gravity, relativity and that free radicals cause ageing, so one needs to have antioxidants”, says Siegfried Hekimi, a biologist at McGill University in Montreal, Canada.
Yet in the early 2000s, scientists trying to build on the theory encountered bewildering results: mice genetically engineered to overproduce free radicals lived just as long as normal mice, and those engineered to overproduce antioxidants didn't live any longer than normal. It was the first of an onslaught of negative data, which initially proved difficult to publish. The free-radical theory “was like some sort of creature we were trying to kill. We kept firing bullets into it, and it just wouldn't die,” says David Gems at University College London, who started to publish his own negative results in 2003 (ref. 6). Then, one study in humans showed that antioxidant supplements prevent the health-promoting effects of exercise, and another associated them with higher mortality.
None of those results has slowed the global antioxidant market, which ranges from food and beverages to livestock feed additives. It is projected to grow from US$2.1 billion in 2013 to $3.1 billion in 2020. “It's a massive racket,” says Gems. “The reason the notion of oxidation and ageing hangs around is because it is perpetuated by people making money out of it.”
Today, most researchers working on ageing agree that free radicals can cause cellular damage, but that this seems to be a normal part of the body's reaction to stress. Still, the field has wasted time and resources as a result. And the idea still holds back publications on possible benefits of free radicals, says Michael Ristow, a metabolism researcher at the Swiss Federal Institute of Technology in Zurich, Switzerland. “There is a significant body of evidence sitting in drawers and hard drives that supports this concept, but people aren't putting it out,” he says. “It's still a major problem.”
Some researchers also question the broader assumption that molecular damage of any kind causes ageing. “There's a question mark about whether really the whole thing should be chucked out,” says Gems. The trouble, he says, is that “people don't know where to go now”.

Myth 3: Humans have exceptionally large brains

The human brain — with its remarkable cognition — is often considered to be the pinnacle of brain evolution. That dominance is often attributed to the brain's exceptionally large size in comparison to the body, as well as its density of neurons and supporting cells, called glia.
None of that, however, is true. “We cherry-pick the numbers that put us on top,” says Lori Marino, a neuroscientist at Emory University in Atlanta, Georgia. Human brains are about seven times larger than one might expect relative to similarly sized animals. But mice and dolphins have about the same proportions, and some birds have a larger ratio.
“Human brains respect the rules of scaling. We have a scaled-up primate brain,” says Chet Sherwood, a biological anthropologist at George Washington University in Washington DC. Even cell counts have been inflated: articles, reviews and textbooks often state that the human brain has 100 billion neurons. More accurate measures suggest that the number is closer to 86 billion. That may sound like a rounding error, but 14 billion neurons is roughly the equivalent of two macaque brains.
Human brains are different from those of other primates in other ways: Homo sapiens evolved an expanded cerebral cortex — the part of the brain involved in functions such as thought and language — and unique changes in neural structure and function in other areas of the brain.
The myth that our brains are unique because of an exceptional number of neurons has done a disservice to neuroscience because other possible differences are rarely investigated, says Sherwood, pointing to the examples of energy metabolism, rates of brain-cell development and long-range connectivity of neurons. “These are all places where you can find human differences, and they seem to be relatively unconnected to total numbers of neurons,” he says.
The field is starting to explore these topics. Projects such as the US National Institutes of Health's Human Connectome Project and the Swiss Federal Institute of Technology in Lausanne's Blue Brain Project are now working to understand brain function through wiring patterns rather than size.

Myth 4: Individuals learn best when taught in their preferred learning style

People attribute other mythical qualities to their unexceptionally large brains. One such myth is that individuals learn best when they are taught in the way they prefer to learn. A verbal learner, for example, supposedly learns best through oral instructions, whereas a visual learner absorbs information most effectively through graphics and other diagrams.
There are two truths at the core of this myth: many people have a preference for how they receive information, and evidence suggests that teachers achieve the best educational outcomes when they present information in multiple sensory modes. Couple that with people's desire to learn and be considered unique, and conditions are ripe for myth-making.
“Learning styles has got it all going for it: a seed of fact, emotional biases and wishful thinking,” says Howard-Jones. Yet just like sugar, pornography and television, “what you prefer is not always good for you or right for you,” says Paul Kirschner, an educational psychologist at the Open University of the Netherlands.
In 2008, four cognitive neuroscientists reviewed the scientific evidence for and against learning styles. Only a few studies had rigorously put the ideas to the test and most of those that did showed that teaching in a person's preferred style had no beneficial effect on his or her learning. “The contrast between the enormous popularity of the learning-styles approach within education and the lack of credible evidence for its utility is, in our opinion, striking and disturbing,” the authors of one study wrote.
That hasn't stopped a lucrative industry from pumping out books and tests for some 71 proposed learning styles. Scientists, too, perpetuate the myth, citing learning styles in more than 360 papers during the past 5 years. “There are groups of researchers who still adhere to the idea, especially folks who developed questionnaires and surveys for categorizing people. They have a strong vested interest,” says Richard Mayer, an educational psychologist at the University of California, Santa Barbara.
In the past few decades, research into educational techniques has started to show that there are interventions that do improve learning, including getting students to summarize or explain concepts to themselves. And it seems almost all individuals, barring those with learning disabilities, learn best from a mixture of words and graphics, rather than either alone.
Yet the learning-styles myth makes it difficult to get these evidence-backed concepts into classrooms. When Howard-Jones speaks to teachers to dispel the learning-styles myth, for example, they often don't like to hear what he has to say. “They have disillusioned faces. Teachers invested hope, time and effort in these ideas,” he says. “After that, they lose interest in the idea that science can support learning and teaching.”

Myth 5: The human population is growing exponentially (and we're doomed)

Fears about overpopulation began with Reverend Thomas Malthus in 1798, who predicted that unchecked exponential population growth would lead to famine and poverty.
But the human population has not been and is not growing exponentially, and is unlikely to do so, says Joel Cohen, a populations researcher at the Rockefeller University in New York City. The world’s population is now growing at just half the rate it was before 1965. Today there are an estimated 7.3 billion people, and that is projected to reach 9.7 billion by 2050. Yet beliefs that the rate of population growth will lead to some doomsday scenario have been continually perpetuated. Celebrated physicist Albert Bartlett, for example, gave more than 1,742 lectures, starting in 1969, on exponential human population growth and its dire consequences.
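The claim that growth is not exponential can be checked numerically. The sketch below compares constant-rate (that is, exponential) projections against the 9.7 billion figure cited above; the roughly 2.1% and 1.05% annual rates are our illustrative assumptions standing in for the pre-1965 peak and "half that rate", not figures from the article:

```python
# Compare constant-rate (exponential) projections with the 2050 figure
# quoted above. The growth rates are illustrative assumptions: ~2.1%/yr
# roughly matches the pre-1965 peak, and ~1.05%/yr is half of that.

pop_2015 = 7.3e9   # estimated world population in the text
years = 35         # 2015 -> 2050

def project(pop, annual_rate, years):
    """Project population forward at a constant annual growth rate."""
    return pop * (1 + annual_rate) ** years

exp_at_peak_rate = project(pop_2015, 0.021, years)
exp_at_half_rate = project(pop_2015, 0.0105, years)

print(f"Exponential at ~2.1%/yr:  {exp_at_peak_rate / 1e9:.1f} billion")
print(f"Exponential at ~1.05%/yr: {exp_at_half_rate / 1e9:.1f} billion")
print("Projection cited above:   9.7 billion")
```

The cited 9.7 billion sits below even the half-rate exponential curve, because the growth rate itself keeps falling, which is exactly what exponential growth forbids.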
The world's population also has enough to eat. According to the Food and Agriculture Organization of the United Nations, the rate of global food production outstrips the growth of the population. People grow enough calories in cereals alone to feed between 10 billion and 12 billion people. Yet hunger and malnutrition persist worldwide. This is because about 55% of the food grown is divided between feeding cattle, making fuel and other materials or going to waste, says Cohen. And what remains is not evenly distributed — the rich have plenty, the poor have little. Likewise, water is not scarce on a global scale, even though 1.2 billion people live in areas where it is.
“Overpopulation is really not overpopulation. It's a question about poverty,” says Nicholas Eberstadt, a demographer at the American Enterprise Institute, a conservative think tank based in Washington DC. Yet instead of examining why poverty exists and how to sustainably support a growing population, he says, social scientists and biologists talk past each other, debating definitions and causes of overpopulation.
Cohen adds that “even people who know the facts use it as an excuse not to pay attention to the problems we have right now”, pointing to the example of economic systems that favour the wealthy.
Like others interviewed for this article, Cohen is less than optimistic about the chances of dispelling the idea of overpopulation and other ubiquitous myths (see ‘Myths that persist’), but he agrees that it is worthwhile to try to prevent future misconceptions. Many myths have emerged after one researcher extrapolated beyond the narrow conclusions of another's work, as was the case for free radicals. That “interpretation creep”, as Spitzer calls it, can lead to misconceptions that are hard to excise. To prevent that, “we can make sure an extrapolation is justified, that we're not going beyond the data”, suggests Spitzer. Beyond that, it comes down to communication, says Howard-Jones. Scientists need to be effective at communicating ideas and get away from simple, boiled-down messages.

Myths that persist

Nature polled doctors and scientists for the medical myths that they find most frustrating. Here's what turned up.
Vaccines cause autism
Although there are some risks associated with vaccines, the connection to neurological disorders has been debunked many times over.
Paracetamol (acetaminophen) works through known mechanisms
Although it is widely used, there are only hints as to how it and other common drugs actually work.
The brain is walled off from the immune system
The brain has its own immune cells, and a lymphatic system that connects the brain to the body's immune system has recently been discovered.
Homeopathy works.
It doesn't.
Once a myth is here, it is often here to stay. Psychological studies suggest that the very act of attempting to dispel a myth leads to stronger attachment to it. In one experiment, exposure to pro-vaccination messages reduced parents' intention to vaccinate their children in the United States. In another, correcting misleading claims from politicians increased false beliefs among those who already held them. “Myths are almost impossible to eradicate,” says Kirschner. “The more you disprove it, often the more hard core it becomes.”

The science myths that will not die : Nature News & Comment

December 18, 2015

Novel Engineering project teaches kids about engineering by using fiction books @TuftsCEEO

The goal of Novel Engineering is to bolster reading comprehension through hands-on projects while teaching students the engineering process and linking it to the human (or at least canine) problems it helps fix.

A Novel Way to Teach Kids About Engineering

At Linden STEAM Academy in Malden, Massachusetts, third graders build a Lego prototype as part of the Novel Engineering initiative, which mixes engineering with literacy in elementary school classrooms.
Photo by Chris Berdik
One recent morning, at a public school in Malden, Massachusetts, north of Boston, teams of third graders rushed around programming sensors on computers, wiring them to motors, digging into bins of Legos and gears, and rummaging through boxes of paper towel rolls, egg cartons, and pipe cleaners. Their mission was to protect a baby turtle from a dog—a beloved, mischievous black Lab named Tornado, the title character of a novel they had read for class.
The fictional dog is named after a twister that flung him into the life of a young farm boy. During one of their adventures, the thirsty pup drinks from a pet turtle’s watery home and slurps up the creature in the process. Hence the turtle-rescue project for these third graders at the Linden STEAM Academy (STEM plus Art). It’s part of an initiative called Novel Engineering led by researchers at nearby Tufts University, in which engineering challenges are plucked from the plots of assigned books. The elementary school lesson plan, developed at Tufts’ Center for Engineering Education and Outreach, is backed by the National Science Foundation.
In America’s push for STEAM education, engineering is at the heart of the acronym, but it’s largely missing from elementary school classrooms. The goal of Novel Engineering is to bolster reading comprehension through hands-on projects while teaching students the engineering process and linking it to the human (or at least canine) problems it helps fix.
In the five years since Novel Engineering began, the CEEO team and partner universities have taught the approach to about 150 teachers around the country. They have also stocked an online repository with sample projects and a list of books, by grade level, that have mixed well with engineering in the past, including Judy Blume’s Tales of a Fourth Grade Nothing, and Roald Dahl’s James and the Giant Peach.
CEEO’s director, Merredith Portsmore, said they work with teachers to choose a good fit from the year’s assigned reading. Some books fit the initiative better than others. For instance, Portsmore noted, “engineering doesn’t really work in fantasy books, like Harry Potter. Because, if you have a magic wand, why would you need engineering?”
“Our goal is to make it easy for teachers, who are sometimes under a lot of pressure,” said Portsmore.
Rather than reducing books to engineering “design briefs,” Portsmore said, Novel Engineering teachers discuss all the challenges characters face, as they normally would, “and then ask which of these problems can we solve with engineering and which ones won’t work for that.”
The scarcity of engineering in grade schools not only slows the supply of home-grown engineers, it hurts their skills, according to Morgan Hynes, an engineering professor who helped lead Novel Engineering before leaving Tufts for Purdue in 2013. That’s because the crucial human pieces of engineering—learning the end-users’ needs and tendencies and working collaboratively to solve problems—can get lost in the shuffle when “real engineering” is postponed until students have mastered advanced math and physics.
“When we ask industry professionals what skills our engineering graduates are missing, they say it’s the social, communication and interpersonal skills,” said Hynes.
The earlier that students are exposed to the human, problem-solving piece of engineering, Hynes argues, the more motivated they’ll be for the technical preparation later. But, elementary school teachers aren’t typically trained or accountable for engineering lessons, and wedging them into a packed school day can be tough.
Teachers at Linden STEAM Academy, which began a partnership with CEEO this year, are fortunate in this regard. Massachusetts is one of only four states with “comprehensive” K-12 engineering learning standards, according to a 2015 report by Purdue researchers. Plus, STEAM Academy teachers get blocks of time for “project-based units” of science and engineering.
“Engineering is embraced at our school,” said Deborah Smith, who teaches the third graders working to safeguard the turtle. “It’s just become part of our day.”
She said Novel Engineering has also spurred writing practice, because students must write and revise descriptions of the problem they’re working on and the design ideas they have to address it. There’s more writing at the end, when students are often asked to write a letter to their character explaining the invention, or to rewrite a scene from the book that incorporates their new idea.
Smith’s class read Tornado weeks before building their prototypes, as part of a larger unit on extreme weather. In a subsequent class, the students jotted down the characters’ challenges and brainstormed solutions. They settled on the turtle problem and then spent another class sketching, writing, and revising their engineering ideas.
Finally, they were ready to prototype—bridges, revolving doors, even retractable covers for the turtle’s home operated by motors linked to sensors to detect the reptile’s approach. From their reading, the students knew the dog was clever, curious and determined, but they also knew he obeyed his master, so one group rigged a dog-level sensor to the turtle habitat that triggered a loud recorded scold, “Stop! Go Away!”
Interestingly, Novel Engineering just started mixing technologies, such as Lego robotics and littleBits circuits, into their approach about 18 months ago. Initially, and still today, they focused on low-tech projects using recyclables, found objects, and craft supplies readily available in most schools. However, students kept dreaming up engineering solutions with motors, sensors, sounds, and other functionality that lent themselves to technology tools CEEO had used in other initiatives. So, the Novel Engineering team has been piloting a high-tech version at the STEAM Academy, while putting together a book of worksheets and sample projects.
Portsmore stresses that Novel Engineering is adaptable to high- and low-tech classrooms. Some schools have a lot of technology in place, or can afford to buy it, but many can’t. Even at the Linden STEAM Academy, Tufts researchers co-teaching the pilot classes come bearing totes full of sensors and other electronics, plus a few extra laptops, because the classrooms have only a handful of computers.
“The big thing in our professional development is to make sure teachers understand the engineering process, and how to help students scope problems and critique each other’s ideas,” Portsmore said. “They can scale that process in different ways, depending on what resources they have. It can be cardboard, recyclables, and duct tape, or these new technologies.”
Going high-tech does change which stories will work in the curriculum, she said, because solutions should make sense for the book’s setting. Infrared sensors and robots can’t help characters in ancient Egypt or colonial America, for instance.
And the details of setting and characters really matter to students in the engineering process, said Smith. They will often refer back to the text to ensure their solution is appropriate. “It’s student-run,” Smith said. “They come up with the ideas. They’re in charge of the outcome. And that drives engagement.”
This story was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Read more about Blended Learning.
Future Tense is a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.

Novel Engineering project teaches kids about engineering by using fiction books.
