2007-02-15

Building the Cortex in Silicon

An ambitious project to model the cerebral cortex in silicon is under way at Stanford. The man-made brain could help scientists understand how the most recently evolved part of our brain performs its complex computational feats, allowing us to understand language, recognize faces, and schedule the day. It could also lead to new neural prosthetics.

"Brains do things in technically and conceptually novel ways--they can solve rather effortlessly issues which we cannot yet resolve with the largest and most modern digital machines," says Rodney Douglas, a professor at the Institute of Neuroinformatics, in Zurich. "One of the ways to explore this is to develop hardware that goes in the same direction."

Neurons communicate with a series of electrical pulses; chemical signals transiently change the electrical properties of individual cells, which in turn trigger an electrical change in the next neuron in the circuit. In the 1980s, Carver Mead, a pioneer in microelectronics at the California Institute of Technology, realized that the same transistors used to build computer chips could be used to build circuits that mimicked the electrical properties of neurons. Since then, scientists and engineers have been using these transistor-based neurons to build more-complicated neural circuits, modeling the retina, the cochlea (the part of the inner ear that translates sound waves into neural signals), and the hippocampus (a part of the brain crucial for memory). They call the process neuromorphing.
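To get a feel for the dynamics these transistor circuits emulate, here is a minimal leaky integrate-and-fire neuron simulated in C. It's only a sketch: the constants are illustrative, and the real neuromorphic chips compute this in analog hardware rather than in a digital loop like this.

    /* A minimal leaky integrate-and-fire neuron: a digital sketch of the
       membrane dynamics that neuromorphic transistor circuits emulate in
       analog. All constants are illustrative, not from any real chip. */
    #include <stdio.h>

    int main(void) {
        double v = 0.0;              /* membrane potential (arbitrary units) */
        const double v_thresh = 1.0; /* spike threshold */
        const double tau = 20.0;     /* membrane time constant, ms */
        const double dt = 1.0;       /* simulation step, ms */
        const double input = 0.08;   /* steady injected current */

        for (int t = 0; t < 200; t++) {
            v += dt * (-v / tau + input);   /* leak toward rest, plus input */
            if (v >= v_thresh) {            /* threshold crossed: spike */
                printf("spike at t = %d ms\n", t);
                v = 0.0;                    /* reset to rest */
            }
        }
        return 0;
    }

With steady input the cell charges, fires, resets, and fires again at regular intervals; a silicon neuron does the same thing with capacitors and transistor leakage currents instead of a software loop.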

2007-02-11

Mimicking how the brain recognizes street scenes

At last, neuroscience is having an impact on computer science and artificial intelligence (AI). For the first time, scientists in Tomaso Poggio's laboratory at the McGovern Institute for Brain Research at MIT have applied a computational model of how the brain processes visual information to a complex, real-world task: recognizing the objects in a busy street scene. The researchers were pleasantly surprised by the power of this new approach.

"People have been talking about computers imitating the brain for a long time," said Poggio, who is also the Eugene McDermott Professor in the Department of Brain and Cognitive Sciences and the co-director of the Center for Biological and Computational Learning at MIT. "That was Alan Turing's original motivation in the 1940s. But in the last 50 years, computer science and AI have developed independently of neuroscience. Our work is biologically inspired computer science."

"We developed a model of the visual system that was meant to be useful for neuroscientists in designing and interpreting experiments, but that also could be used for computer science," said Thomas Serre, a former PhD student and now a post-doctoral researcher in Poggio's lab and lead author a paper about the street scene application in the 2007 IEEE Transactions on Pattern Analysis and Machine Intelligence. "We chose street scene recognition as an example because it has a restricted set of object categories, and it has practical social applications."

http://www.eurekalert.org/pub_releases/2007-02/mifb-mht020607.php

Quantum computer to debut next Tuesday?

Remember where you were when you heard about Steorn? Us neither. (Yet.) Kind of the same with D-Wave, which, as you may recall, claims to be the first and only "commercial" quantum computing venture; despite a low-hanging cloud of skeptical academics, D-Wave is claiming next Tuesday it'll finally debut the first quantum computer: a 16-qubit processor capable of 64,000 simultaneous calculations in quantum space(s). What's a qubit? Why, it's the quantum computing equivalent of a conventional computer's bit (i.e. more (qu)bits = more data and processes), but we're not even going to insult your intelligence by pretending to understand how a many-hundreds-of-qubits quantum computer could supposedly perform more simultaneous operations than the universe has atoms. We just know that a quantum computer has yet to be built, that it has the potential to revolutionize the way we understand and use computation, and that with any luck D-Wave's supposed machine will be promptly put to work analyzing weather patterns so we'll know the exact climate this time next year and not buy the wrong things when this year's fall lines come out. That is, if it doesn't open up a black hole, or something.
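For what it's worth, the qubit arithmetic is easy to sanity-check: n qubits can hold a superposition over 2^n basis states, so 16 qubits give 2^16 = 65,536 (presumably the source of that rounded-down "64,000"), while a few hundred qubits exceed the usual rough estimate of 10^80 atoms in the observable universe. A quick back-of-the-envelope check in C:

    /* Back-of-the-envelope qubit arithmetic. The 1e80 atom count is the
       usual rough estimate for the observable universe, nothing exact. */
    #include <stdio.h>
    #include <math.h>

    int main(void) {
        printf("16 qubits  -> 2^16  = %.0f basis states\n", pow(2.0, 16.0));
        printf("300 qubits -> 2^300 = %.2e basis states\n", pow(2.0, 300.0));
        printf("atoms in observable universe ~ 1e80\n");
        return 0;
    }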

http://www.engadget.com/2007/02/08/quantum-computer-to-debut-next-tuesday/

2007-02-07

Why Windows is less secure than Linux

Windows is inherently harder to secure than Linux. There, I said it. The simple truth.

Many millions of words have been written and said on this topic. I have a couple of pictures. The basic argument goes like this: in its long evolution, Windows has grown so complicated that it is harder to secure. These images make the point very well. Each image is a complete map of the system calls that occur when a web server serves up a single page of HTML with a single picture. The same page and picture in both cases. A system call is an opportunity to address memory. A hacker investigates each memory access to see if it is vulnerable to a buffer-overflow attack. The developer must do QA on each of these entry points. The more system calls, the greater the potential for vulnerability, and the more effort needed to create secure applications.
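To make the entry-point worry concrete, here is the textbook flaw a hacker probes each one for: an unchecked copy into a fixed-size stack buffer. This is hypothetical demonstration code, not anything from Apache or IIS.

    /* Classic buffer overflow: hypothetical demo, not real server code. */
    #include <stdio.h>
    #include <string.h>

    static void handle_request(const char *request) {
        char buf[16];
        /* BUG: no length check. A request longer than 15 bytes writes past
           buf, clobbering the stack and, potentially, the return address. */
        strcpy(buf, request);
        printf("handled: %s\n", buf);
    }

    int main(void) {
        handle_request("GET /index.htm");  /* fits in the buffer */
        /* handle_request("AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA");  overflows */
        return 0;
    }

Multiply that audit by every entry point in the maps below and the pictures speak for themselves.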

The first picture is of the system calls that occur on a Linux server running Apache.

[image: SysCallApachesmall.jpg]

This second image is of a Windows Server running IIS.

[image: SysCallIISsmall.jpg]

http://blogs.zdnet.com/threatchaos/?p=311
