A forum for discussing emerging smart discoveries and technologies with built-in intelligence or embedded smarts, as well as the new cognitive skills needed to succeed in the smart economy. The Smart Future is already here; the last page just hasn't been written yet! Every advance brings benefits as well as intrusions.
Have your say!! Read, enjoy, explore, speculate, comment!!
Even a seemingly simple decision like whether to call heads or tails in a coin toss requires complex calculations by our brains. A new study in Nature Neuroscience has identified a region of the brain involved in decision-making: researchers at the University of British Columbia found that a small region known as the lateral habenula, previously linked to depression, appears to be crucial to making cost-benefit calculations. The scientists trained a group of rats to choose between a small reward that was delivered consistently (one food pellet) and a larger reward (four pellets) that was delivered only occasionally. If the larger reward arrived frequently enough, it made sense to hold out for the four pellets; otherwise, it was better to go with the smaller reward. But when the researchers inactivated the lateral habenula, the rats were no longer able to make these calculations; instead, they chose between the large and small rewards seemingly at random. The lateral habenula is one of the regions targeted by deep brain stimulation (DBS), and the researchers say these results hint that DBS may work not by making people happier but by decreasing how much they care about what makes them depressed.
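A back-of-the-envelope way to see the cost-benefit calculation the rats faced (a deliberate simplification of the task, not the study's actual model): the large lever is worth holding out for only when its expected payoff beats the sure single pellet.

```python
# Toy expected-value model of the rats' choice (not the study's analysis).

def preferred_option(p_large, large=4, small=1):
    """Return which lever maximizes expected pellets.

    p_large: probability the large (4-pellet) reward is delivered.
    The 1-pellet reward is assumed to arrive every time.
    """
    expected_large = p_large * large
    return "large" if expected_large > small else "small"

# If the big reward arrives often enough, holding out pays off...
print(preferred_option(0.5))   # 0.5 * 4 = 2.0 expected pellets
# ...otherwise the sure single pellet is the better bet.
print(preferred_option(0.2))   # 0.2 * 4 = 0.8 expected pellets
```

With an intact lateral habenula, the rats' behavior tracked this kind of comparison; with it inactivated, their choices looked random regardless of the payoff odds.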
Read more: http://bit.ly/InlmlV Journal article: What's better for me? Fundamental role for lateral habenula in promoting subjective decision biases. Nature Neuroscience, 2013. doi:10.1038/nn.3587 Image credit: Freddy The Boy/Flickr
A UT Arlington assistant engineering professor has developed a computational model that can more accurately predict when an epileptic seizure will occur next based on the patient’s personalized medical information.
Shouyi Wang, assistant professor of Industrial and Manufacturing Systems Engineering
Wang’s model analyzes electroencephalography, or EEG, readings from an individual, to predict future seizures. Early warnings could lead a patient to use medicine to combat an oncoming seizure, he said.
“The challenge with seizure prediction has been that every epileptic is different. Some patients suffer several seizures a day. Others will go several years without experiencing a seizure,” Wang said. “But if we use the EEG readings to build a personalized data profile, we’re better able to understand what’s happening to that person.”
Epilepsy is one of the most common neurological disorders, characterized by recurrent seizures. Epilepsy and seizures affect nearly 3 million Americans, at an estimated $17.6 billion annually in direct and indirect costs, according to the national Epilepsy Foundation. About 10 percent of the American population will experience a seizure in their lifetime, the agency says.
Wang teamed with Wanpracha Art Chaovalitwongse of the University of Washington and Stephen Wong of the Rutgers Robert Wood Johnson Medical School for the research.
Wang said early indications are that the new computational model could provide 70 percent accuracy or better and give a prediction horizon of about 30 minutes before the actual seizure would occur.
The current model collects data through a cap embedded with EEG wires. Wang’s team is working to develop a less obtrusive EEG cap that will record and transmit readings to a box for easy data download or transmission.
Victoria Chen, professor and chairwoman of the Industrial and Manufacturing Systems Engineering Department, said Wang’s work in the area of bioinformatics offers hope for the many people who suffer from epilepsy.
“This computational model might be used to predict other life-threatening episodes of diseases,” Chen said.
Wang said his model builds upon an adaptive learning framework and achieves increasingly accurate predictions for each individual patient as it collects more personalized medical data.
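The adaptive idea can be sketched in miniature. This is a hypothetical illustration, not Wang's actual model: a per-patient classifier whose decision improves as more labeled EEG segments accumulate for that individual.

```python
# Hypothetical sketch of per-patient adaptive seizure prediction.
# Features here are single numbers (e.g. a band-power summary of an
# EEG segment); the real model uses far richer EEG representations.

class PersonalizedSeizurePredictor:
    def __init__(self):
        self.preictal = []   # feature values seen before past seizures
        self.baseline = []   # feature values from seizure-free periods

    def update(self, feature, was_preictal):
        """Add one labeled EEG feature to this patient's profile."""
        (self.preictal if was_preictal else self.baseline).append(feature)

    def predict(self, feature):
        """Warn if a new reading is closer to past preictal activity."""
        if not self.preictal or not self.baseline:
            return False  # not enough personal history yet
        mean_pre = sum(self.preictal) / len(self.preictal)
        mean_base = sum(self.baseline) / len(self.baseline)
        return abs(feature - mean_pre) < abs(feature - mean_base)

model = PersonalizedSeizurePredictor()
for f in (0.9, 1.1, 1.0):
    model.update(f, was_preictal=True)
for f in (0.1, 0.2, 0.15):
    model.update(f, was_preictal=False)
print(model.predict(0.95))  # True: resembles this patient's preictal pattern
```

Each new labeled segment shifts the patient's personal profile, which is the sense in which such a model gets better the longer it observes one individual.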
“As a society, we’ve gotten really good at looking at the big picture,” Wang said. “We can tell you the likelihood of suffering a heart attack if you’re over a certain age, of a certain weight and if you smoke. But we have only started to personalize that data for individuals who are all different.”
Wang’s work is representative of excellence at The University of Texas at Arlington, a comprehensive research institution of more than 33,300 students and 2,200 faculty members in the epicenter of North Texas. It is the second largest institution in The University of Texas System. Total research expenditures reached almost $78 million last year. Visit www.uta.edu to learn more.
Consumers love a sale. In fact, when asked what makes a sale appealing, most simply say, "The price was good." But this answer fails to acknowledge that subjective factors also contribute to the perceived value of a deal. According to new research published in the Journal of Consumer Research, it's possible to increase the perception of a good deal.
"We find that the more a consumer relies on the original price when trying to determine a product's worth, the more valuable they perceive the deal to be," write authors Christina Kan, Donald R. Lichtenstein (both University of Colorado), Susan Jung Grant (Boston University), and Chris Janiszewski (University of Florida). "If a retailer can get a consumer to pay more attention to a $179 original list price, and less attention to a $99 sale price, when assessing the worth of a winter jacket, then the $99 sale price will seem like a better deal."
The study identifies three situations in which list prices have more influence on the estimated worth of a product and, by extension, the perceived value of the deal. In three experiments, the authors show that when a consumer focuses on similarities between competing products, they are more likely to consider all of the available information when judging the worth of a product. That is, both the original list price and the sale price are used to determine the product's perceived worth. In contrast, when a consumer focuses on product dissimilarities, the consumer is more likely to consider only the sale price when determining the subjective value of the product.
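The mechanism can be captured in a simple anchoring model. The paper does not give this formula; it is an illustrative sketch in which perceived worth is a weighted blend of the list and sale prices, and the "deal value" is perceived worth minus what the shopper actually pays.

```python
# Illustrative anchoring sketch (not the authors' model): the more
# weight the list price gets when judging worth, the better the deal looks.

def perceived_deal_value(list_price, sale_price, list_weight):
    """list_weight in [0, 1]: how strongly the list price anchors worth."""
    worth = list_weight * list_price + (1 - list_weight) * sale_price
    return worth - sale_price

# The winter jacket from the example: $179 list, $99 sale.
print(perceived_deal_value(179, 99, 0.8))  # heavy reliance on list price
print(perceived_deal_value(179, 99, 0.2))  # mostly anchored on sale price
```

When attention shifts toward the list price (a higher weight), the same $99 jacket feels like a much better deal, which is exactly the lever the authors say retailers can pull.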
"This research provides insights for both retailers and consumers. Retailers can make a sales event more effective by encouraging the consumer to rely on the original price when assessing both the value of the product and the value of the deal. Additionally, by comparing product prices at competing retailers, consumers can lessen the impact of the original price on their assessment of the products' overall worth," the authors conclude.
Christina Kan, Donald R. Lichtenstein, Susan Jung Grant, and Chris Janiszewski. "Strengthening the Influence of Advertised Reference Prices through Information Priming." Journal of Consumer Research: April 2014. For more information, contact Christina Kan (Christina.Kan@Colorado.edu) or visit http://ejcr.org/.
Unexpected behavior in ferroelectric materials explored by researchers at Oak Ridge National Laboratory supports a new approach to information storage and processing known as memcomputing. (hi-res image)
OAK RIDGE, Tenn., Nov. 18, 2013—Unexpected behavior in ferroelectric materials explored by researchers at the Department of Energy’s Oak Ridge National Laboratory supports a new approach to information storage and processing.
Ferroelectric materials are known for their ability to spontaneously switch polarization when an electric field is applied. Using a scanning probe microscope, the ORNL-led team took advantage of this property to draw areas of switched polarization called domains on the surface of a ferroelectric material. To the researchers’ surprise, when written in dense arrays, the domains began forming complex and unpredictable patterns on the material’s surface.
“When we reduced the distance between domains, we started to see things that should have been completely impossible,” said ORNL’s Anton Ievlev, the first author on the paper published in Nature Physics. “All of a sudden, when we tried to draw a domain, it wouldn’t form, or it would form in an alternating pattern like a checkerboard. At first glance, it didn’t make any sense. We thought that when a domain forms, it forms. It shouldn’t be dependent on surrounding domains.”
After studying patterns of domain formation under varying conditions, the researchers realized the complex behavior could be explained through chaos theory. One domain would suppress the creation of a second domain nearby but facilitate the formation of one farther away -- a precondition of chaotic behavior, says ORNL’s Sergei Kalinin, who led the study.
“Chaotic behavior is generally realized in time, not in space,” he said. “An example is a dripping faucet: sometimes the droplets fall in a regular pattern, sometimes not, but it is a time-dependent process. To see chaotic behavior realized in space, as in our experiment, is highly unusual.”
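A one-dimensional cartoon conveys the suppression effect the researchers describe. This toy model is an assumption-laden sketch, not the paper's physics: each attempted domain simply fails to form if another domain already sits within a suppression radius.

```python
# Toy 1-D model of dense domain writing (not the paper's physics):
# a new domain is suppressed by any existing domain within r_suppress,
# but unaffected by domains farther away.

def write_domains(positions, r_suppress):
    """Attempt to write domains at the given positions, in order."""
    formed = []
    for x in positions:
        if all(abs(x - y) > r_suppress for y in formed):
            formed.append(x)  # far enough from neighbors: domain forms
    return formed

# Widely spaced attempts all succeed...
print(write_domains([0, 3, 6, 9], r_suppress=2))          # [0, 3, 6, 9]
# ...but a dense array yields the alternating "checkerboard" pattern.
print(write_domains([0, 1, 2, 3, 4, 5], r_suppress=1.5))  # [0, 2, 4]
```

Even this crude rule reproduces the qualitative surprise: below a critical spacing, every other attempted domain simply refuses to form.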
Collaborator Yuriy Pershin of the University of South Carolina explains that the team’s system possesses key characteristics needed for memcomputing, an emergent computing paradigm in which information storage and processing occur on the same physical platform.
“Memcomputing is basically how the human brain operates: Neurons and their connections--synapses--can store and process information in the same location,” Pershin said. “This experiment with ferroelectric domains demonstrates the possibility of memcomputing.”
Encoding information in the domain radius could allow researchers to create logic operations on a surface of ferroelectric material, thereby combining the locations of information storage and processing.
The researchers note that although the system in principle has a universal computing ability, much more work is required to design a commercially attractive all-electronic computing device based on the domain interaction effect.
“These studies also make us rethink the role of surface and electrochemical phenomena in ferroelectric materials, since the domain interactions are directly traced to the behavior of surface screening charges liberated during electrochemical reaction coupled to the switching process,” Kalinin said.
The study is published as “Intermittency, quasiperiodicity, and chaos during scanning probe microscopy tip-induced ferroelectric domain switching,” and is available online. Coauthors are ORNL’s Stephen Jesse, Evgheni Strelcov, Sergei Kalinin and Amit Kumar; the National Academy of Sciences of Ukraine’s Anna Morozovska and Eugene Eliseev; the University of South Carolina’s Yuriy Pershin; and Ural Federal University’s Vladimir Shur. Ievlev, formerly of Ural Federal University, has joined ORNL as a postdoctoral fellow.
Part of this research was conducted at the Center for Nanophase Materials Sciences, which is sponsored at ORNL by the Scientific User Facilities Division in DOE’s Office of Basic Energy Sciences. CNMS is one of the five DOE Nanoscale Science Research Centers supported by the DOE Office of Science, premier national user facilities for interdisciplinary research at the nanoscale. Together the NSRCs comprise a suite of complementary facilities that provide researchers with state-of-the-art capabilities to fabricate, process, characterize and model nanoscale materials, and constitute the largest infrastructure investment of the National Nanotechnology Initiative. The NSRCs are located at DOE's Argonne, Brookhaven, Lawrence Berkeley, Oak Ridge, Sandia and Los Alamos national laboratories. For more information about the DOE NSRCs, please visit http://science.energy.gov/bes/suf/user-facilities/nanoscale-science-research-centers/.
ORNL is managed by UT-Battelle for the Department of Energy's Office of Science. DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit http://science.energy.gov.
Research in recent years has suggested that young Americans might be less creative now than in decades past, even while their intelligence — as measured by IQ tests — continues to rise.
But new research from the University of Washington Information School and Harvard University, closely studying 20 years of student creative writing and visual artworks, hints that the dynamics of creativity may not break down as simply as that.
Instead, it may be that some aspects of creativity — such as those employed in visual arts — are gently rising over the years, while other aspects, such as the nuances of creative writing, could be declining.
The paper will be published in Creativity Research Journal in January 2014. The lead author is Emily Weinstein, a doctoral student in the Harvard Graduate School of Education.
Katie Davis, UW assistant professor, and fellow researchers studied 354 examples of visual art and 50 examples of creative writing by teenagers published between 1990 and 2011. The question they pursued, Davis said, was “How have the style, content and form of adolescents’ art-making and creative writing changed over the last 20 years?”
The artwork came from a monthly magazine for teens, the writing from a similar annual publication featuring student fiction. The researchers analyzed and coded the works, blind as to year, looking for trends over that time.
The review of student visual art showed an increase in sophistication and complexity in both the designs and the subject matter over the years. The pieces, Davis said, seemed “more finished, and fuller, with backgrounds more fully rendered, suggesting greater complexity.” Standard pen-and-ink illustrations grew less common over the period studied, while a broader range of mixed media work was represented.
Conversely, the review of student writing showed the young authors adhering more to “conventional writing practices” and a trend toward less play with genre, more mundane narratives and simpler language over the two decades studied.
Still, Davis said, it’s too simple to just say creativity increased in one area and decreased in another over the years.
“There really isn’t a standard set of agreed-upon criteria to measure something as complex and subjective as creativity,” she said. “But there are markers of creativity — like complexity and risk-taking and breaking away from the standard mold — that appear to have changed.”
The researchers also note that the period of study was a time of great innovation in digital art, with new tools for creative production and boundless examples of fine art a mere click or two away, serving to inform and inspire the students in their own work.
Davis said that while previous research has typically studied creativity in a lab setting, this work examined student creative work in a more “naturalistic” setting, where it is found in everyday life.
She added that with data from such a naturalistic setting, researchers cede a degree of control over the characteristics of the sample being studied, and the findings cannot safely be generalized to all American youth.
“It remains an open question as to whether the entire U.S. has seen a decline in literary creativity and a parallel increase in visual creativity among its youth over the last 20 years,” Davis said. “Because society — indeed any society — depends on the creativity of its citizens to flourish, this is a question that warrants serious attention in future creativity research.”
The paper’s other co-authors are Zachary Clark and Donna DiBartolomeo, former graduate students at Harvard.
The findings are also discussed in Davis’ recent book with Howard Gardner, “The App Generation.” The research was funded by the James and Judith K. Dimon Foundation.
TORONTO, ON – A new study from the University of Toronto Scarborough shows that people who are aware of and accept their own thoughts and emotions are less affected by positive feedback from others.
The study, authored by UTSC PhD candidate Rimma Teper, finds that individuals high in trait mindfulness show less neural response to positive feedback than their less mindful peers.
“These findings suggest that mindful individuals may be less affected by immediate rewards, which fits well with the idea that mindful individuals are typically less impulsive,” says Teper.
Trait mindfulness is characterized by an ability to recognize and accept one’s thoughts and emotions without judgment. Mindful individuals are much better at letting their feelings and thoughts go rather than getting carried away.
Using electroencephalography (EEG), the researchers recorded participants’ brain activity while they completed a reaction time task on a computer. The authors were interested in participants’ brain activity in response to receiving performance feedback that was rewarding, neutral or negative in nature. Not only were mindful individuals less responsive to rewarding feedback compared to others, they also showed less difference in their neural response to neutral versus rewarding feedback.
The findings also align with clinical research supporting the notion that accepting one’s emotions is an important indicator of mental well-being.
“Individuals who are problem gamblers for instance show more brain reactivity to immediate rewards, because they are typically more impulsive,” says Teper.
“Many studies, including our own past work, have shown that people who meditate, and mindful individuals exhibit improved self-control. If mindful individuals are also less affected by immediate rewards, as our study suggests, this may help explain why,” says Teper’s PhD supervisor and UTSC psychology professor Michael Inzlicht.
The research was published this week in the journal Emotion.
For more information contact:
Rimma Teper University of Toronto Scarborough Department of Psychology 416-648-3843 firstname.lastname@example.org
To us humans, all tail wags look alike. But new research published in Current Biology shows that dogs can differentiate between a left-wagging tail and one that wags to the right. Previous work by scientists at the University of Trento in Italy showed that dogs wag to different sides based on their current emotional state (wags to the right indicate happiness, wags to the left show anxiety and similar emotions). In their latest study, these researchers showed that dogs can distinguish between these two types of wags. When dogs were shown a video clip of a left-wagging dog, their heart rates increased and they showed signs of anxiety. When they watched a video of a right-wagging dog, the animals stayed totally relaxed. The scientists don't believe the dogs are deliberately communicating via tail wags. Instead, they believe that it's a reflection of their asymmetrically organized brains, and other dogs have learned to interpret these signals.
Read more: http://bit.ly/16OI0PF Journal article: Seeing left or right asymmetric tail wagging produces different emotional responses in dogs. Current Biology, 2013 DOI: 10.1016/j.cub.2013.09.027 Image credit: ssetaro/Flickr
First of its kind, brain-inspired device looks toward highly efficient and fast parallel computing
Cambridge, Mass. – November 1, 2013 – It doesn't take a Watson to realize that even the world's best supercomputers are staggeringly inefficient and energy-intensive machines.
Our brains have upwards of 86 billion neurons, connected by synapses that not only complete myriad logic circuits but also continuously adapt to stimuli, strengthening some connections while weakening others. We call that process learning, and it enables the kind of rapid, highly efficient computational processes that put Siri and Blue Gene to shame.
Materials scientists at the Harvard School of Engineering and Applied Sciences (SEAS) have now created a new type of transistor that mimics the behavior of a synapse. The novel device simultaneously modulates the flow of information in a circuit and physically adapts to changing signals.
Exploiting unusual properties in modern materials, the synaptic transistor could mark the beginning of a new kind of artificial intelligence: one embedded not in smart algorithms but in the very architecture of a computer. The findings appear in Nature Communications.
“There’s extraordinary interest in building energy-efficient electronics these days,” says principal investigator Shriram Ramanathan, associate professor of materials science at Harvard SEAS. “Historically, people have been focused on speed, but with speed comes the penalty of power dissipation. With electronics becoming more and more powerful and ubiquitous, you could have a huge impact by cutting down the amount of energy they consume.”
The human mind, for all its phenomenal computing power, runs on roughly 20 Watts of energy (less than a household light bulb), so it offers a natural model for engineers.
“The transistor we’ve demonstrated is really an analog to the synapse in our brains,” says co-lead author Jian Shi, a postdoctoral fellow at SEAS. “Each time a neuron initiates an action and another neuron reacts, the synapse between them increases the strength of its connection. And the faster the neurons spike each time, the stronger the synaptic connection. Essentially, it memorizes the action between the neurons.”
In principle, a system integrating millions of tiny synaptic transistors and neuron terminals could take parallel computing into a new era of ultra-efficient high performance.
Several prototypes of the synaptic transistor are visible on this silicon chip. (Photo by Eliza Grinnell, SEAS Communications.)
While calcium ions and receptors effect a change in a biological synapse, the artificial version achieves the same plasticity with oxygen ions. When a voltage is applied, these ions slip in and out of the crystal lattice of a very thin (80-nanometer) film of samarium nickelate, which acts as the synapse channel between two platinum "axon" and "dendrite" terminals. The varying concentration of ions in the nickelate raises or lowers its conductance—that is, its ability to carry information on an electrical current—and, just as in a natural synapse, the strength of the connection depends on the time delay in the electrical signal.
Structurally, the device consists of the nickelate semiconductor sandwiched between two platinum electrodes and adjacent to a small pocket of ionic liquid. An external circuit multiplexer converts the time delay into a magnitude of voltage which it applies to the ionic liquid, creating an electric field that either drives ions into the nickelate or removes them. The entire device, just a few hundred microns long, is embedded in a silicon chip.
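The plasticity rule described above can be sketched in a few lines. This is a hedged illustration of the qualitative behavior, not the device's measured response; the functional form and parameters (maximum conductance, decay constant, learning rate) are assumptions chosen only to show the shape of the effect: shorter delays between pulses strengthen the channel more.

```python
import math

# Hypothetical delay-dependent conductance update, loosely mirroring
# the synaptic behavior described in the article: the boost decays
# exponentially with the delay between "pre" and "post" pulses.

def updated_conductance(g, delay_ms, g_max=1.0, tau_ms=20.0, rate=0.1):
    """Increase conductance g by an amount that shrinks with pulse delay."""
    dg = rate * (g_max - g) * math.exp(-delay_ms / tau_ms)
    return g + dg

g = 0.5
print(updated_conductance(g, delay_ms=5))    # short delay: larger boost
print(updated_conductance(g, delay_ms=100))  # long delay: barely changes
```

The (g_max - g) factor keeps the conductance bounded, an analog of a synapse saturating rather than strengthening without limit.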
The synaptic transistor offers several immediate advantages over traditional silicon transistors. For a start, it is not restricted to the binary system of ones and zeros.
“This system changes its conductance in an analog way, continuously, as the composition of the material changes,” explains Shi. “It would be rather challenging to use CMOS, the traditional circuit technology, to imitate a synapse, because real biological synapses have a practically unlimited number of possible states—not just ‘on’ or ‘off.’”
The synaptic transistor offers another advantage: non-volatile memory, which means even when power is interrupted, the device remembers its state.
Additionally, the new transistor is inherently energy efficient. The nickelate belongs to an unusual class of materials, called correlated electron systems, that can undergo an insulator-metal transition. At a certain temperature—or, in this case, when exposed to an external field—the conductance of the material suddenly changes.
“We exploit the extreme sensitivity of this material,” says Ramanathan. “A very small excitation allows you to get a large signal, so the input energy required to drive this switching is potentially very small. That could translate into a large boost for energy efficiency.”
The nickelate system is also well positioned for seamless integration into existing silicon-based systems.
“In this paper, we demonstrate high-temperature operation, but the beauty of this type of a device is that the 'learning' behavior is more or less temperature insensitive, and that’s a big advantage,” says Ramanathan. “We can operate this anywhere from about room temperature up to at least 160 degrees Celsius.”
For now, the limitations relate to the challenges of synthesizing a relatively unexplored material system, and to the size of the device, which affects its speed.
“In our proof-of-concept device, the time constant is really set by our experimental geometry,” says Ramanathan. “In other words, to really make a super-fast device, all you’d have to do is confine the liquid and position the gate electrode closer to it.”
In fact, Ramanathan and his research team are already planning, with microfluidics experts at SEAS, to investigate the possibilities and limits for this “ultimate fluidic transistor.”
He also has a seed grant from the National Academy of Sciences to explore the integration of synaptic transistors into bioinspired circuits, with L. Mahadevan, Lola England de Valpine Professor of Applied Mathematics, professor of organismic and evolutionary biology, and professor of physics.
“In the SEAS setting it’s very exciting; we’re able to collaborate easily with people from very diverse interests,” Ramanathan says.
For the materials scientist, as much curiosity derives from exploring the capabilities of correlated oxides (like the nickelate used in this study) as from the possible applications.
“You have to build new instrumentation to be able to synthesize these new materials, but once you’re able to do that, you really have a completely new material system whose properties are virtually unexplored,” Ramanathan says. “It’s very exciting to have such materials to work with, where very little is known about them and you have an opportunity to build knowledge from scratch.”
“This kind of proof-of-concept demonstration carries that work into the ‘applied’ world,” he adds, “where you can really translate these exotic electronic properties into compelling, state-of-the-art devices.”
This research was supported by the National Science Foundation (NSF), the Army Research Office’s Multidisciplinary University Research Initiative, and the Air Force Office of Scientific Research. The team also benefited from the facilities at the Harvard Center for Nanoscale Systems, a member of the NSF-supported National Nanotechnology Infrastructure Network. Sieu D. Ha, a postdoctoral fellow at SEAS, was the co-lead author; additional coauthors included graduate student You Zhou and Frank Schoofs, a former postdoctoral fellow.
A new study published in Science found that a gene known as SRPX2, which previously has been linked to epilepsy and speech disorders, also influences language development and synapse formation. Researchers at Johns Hopkins University were initially looking for genes that helped promote the formation of new synapses. They found that when SRPX2 was upregulated, it led to the rapid formation of many new synapses. But when the researchers blocked SRPX2 in fetal mice, they found the mice were unable to make their normal distress calls when separated from their mother, indicating that the lack of this gene led to problems with the mouse equivalent of language development. Transgenic mice carrying a mutated human SRPX2 showed similar problems. The precise links between SRPX2, synapse formation, and language development will require further study, scientists say.
Read more: http://bit.ly/1at5Y3P Journal article: The Human Language–Associated Gene SRPX2 Regulates Synapse Formation and Vocalization in Mice. Science, 2013 DOI: 10.1126/science.1245079 Image credit: Wellcome Images
Neuroscientists Can Quickly, Accurately Measure Creativity
Thu, 10/31/2013 - 7:00am
Michigan State Univ.
While some believe creativity is spontaneous, Michigan State Univ. neuroscientist Jeremy Gray suspects there’s a lot of hard work going on in the brain even when the proverbial light bulb going off feels effortless. (Image: Michigan State Univ.)
A team of researchers led by a Michigan State Univ. neuroscientist has created a quick but reliable test that can measure a person’s creativity from single spoken words.
The “noun-verb” test is so simple it can be done by virtually anyone anywhere – even in an MRI machine, setting the stage for scientists to pinpoint how the brain comes up with unusually creative ideas.
While some believe ingenuity is spontaneous, MSU neuroscientist Jeremy Gray suspects there’s a lot of hard work going on in the brain even when the proverbial light bulb going off feels effortless. The findings from his latest research are published in the journal Behavior Research Methods.
“We want to understand what makes creativity tick, what the specific processes are in the brain,” Gray says. “Innovation doesn’t just come for free – nobody learns their ABCs in kindergarten and suddenly writes a great novel or poem, for example. People need to master their craft before they can start to be creative in interesting ways.”
For his latest research, 193 participants were shown a series of nouns and instructed to respond creatively with a verb in each case. The test took about two minutes.
For the noun “chair,” for example, instead of answering with the standard verb “sit,” a participant might answer “stand,” as in to stand on a chair to change a light bulb. The researchers checked that the answers were in fact verbs and somehow related to the noun; excluding the few nonsensical responses made no difference to the results.
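The article does not spell out how responses were scored, but one common proxy for creativity in tests like this is statistical infrequency: rarer (but still valid) answers score higher. A minimal sketch, using a hypothetical pool of responses to the noun “chair”:

```python
from collections import Counter

# Statistical-infrequency scoring sketch (an assumed scheme, not
# necessarily the one used in Gray's study): each valid verb response
# is scored by how rarely it appears in the whole sample.

def infrequency_scores(responses):
    """Map each verb to a score in [0, 1); rarer responses score higher."""
    counts = Counter(responses)
    n = len(responses)
    return {verb: 1 - counts[verb] / n for verb in counts}

# Hypothetical responses to the noun "chair":
pool = ["sit", "sit", "sit", "sit", "stand", "recline", "sit", "sit"]
scores = infrequency_scores(pool)
print(scores["stand"] > scores["sit"])  # True: the rarer answer scores higher
```

Under this scheme, the standard answer “sit” earns a low score while “stand” (as in standing on a chair to change a light bulb) earns a high one, matching the intuition in the example above.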
The participants also were measured for creativity through a series of more in-depth methods including story writing, drawing and their creative achievements in real life.
The results: those who gave creative answers in the noun-verb test were indeed the most creative as measured by the more in-depth methods. This suggests the noun-verb test, or a future variation, could be successful by itself in measuring creativity.
Currently, Gray and his team are having participants complete the noun-verb test in an MRI while their brain activity is recorded, in hopes of identifying parts of the brain responsible for creativity. This test is more feasible in an MRI than, say, writing stories or drawing pictures since the machine requires people to remain virtually still.
Although much more research is needed, the findings eventually could help students, entrepreneurs, scientists and others who depend on innovative thinking.
“Ultimately, this work could allow us to create better educational and training programs to help people foster their creativity,” Gray says.
The research also could be helpful in settings where selecting creative people is important, such as the human resources office, he says.