Previously, scientists at IBM and other institutes successfully demonstrated the ability to store 1 bit per cell in PCM. But today, at the IEEE International Memory Workshop in Paris, IBM scientists are presenting, for the first time, the successful storage of 3 bits per cell in a 64k-cell array, at elevated temperatures and after 1 million endurance cycles.
“Phase change memory is the first instantiation of a universal memory with properties of both DRAM and flash, thus answering one of the grand challenges of our industry,” said Dr. Haris Pozidis, an author of the paper and the manager of non-volatile memory research at IBM Research – Zurich. “Reaching 3 bits per cell is a significant milestone because at this density the cost of PCM will be significantly less than DRAM and closer to flash.”
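Storing 3 bits in a single cell means the cell must hold one of 2³ = 8 distinguishable resistance levels rather than just 2. The toy sketch below shows only that bookkeeping, mapping 3-bit values to level indices and back; it is an illustration of the arithmetic, not IBM's actual level-placement or drift-compensation scheme, and all names are invented for the example.

```python
# Toy model of multi-level-cell bookkeeping: 3 bits per cell requires
# 2**3 = 8 distinguishable resistance levels. Illustrative only -- not
# IBM's actual PCM encoding or drift-compensation scheme.
BITS_PER_CELL = 3
LEVELS = 2 ** BITS_PER_CELL  # 8 resistance levels per cell

def bits_to_level(bits):
    """Pack a tuple of bits, most significant first, into a level index."""
    level = 0
    for b in bits:
        level = (level << 1) | b
    return level

def level_to_bits(level):
    """Unpack a level index back into its 3 bits, most significant first."""
    return tuple((level >> i) & 1 for i in reversed(range(BITS_PER_CELL)))

print(LEVELS)                      # 8
print(bits_to_level((1, 0, 1)))    # 5
print(level_to_bits(5))            # (1, 0, 1)
```

The same arithmetic explains the density gain: tripling the bits per cell triples capacity for roughly the same cell count, which is what moves PCM's cost per bit toward flash.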
Exactly how deep is the Patent Office’s cloud expertise, anyway?
Is it as deep as its touch screen expertise, which led to its award of all those patents to Apple on the iPhone, even though prior art seemed to indicate Apple didn’t invent very many of the touch screen’s features? I hope the Patent Office will do better by the cloud in terms of keeping it out of one vendor’s hands.
Quantum computers must overcome the challenge of detecting and correcting quantum errors before they can fulfill their promise of sifting through millions of possible solutions much faster than classical computers.
Detecting quantum errors is anything but straightforward. Classical computers can detect and correct their bit-flip errors by simply copying the same bit many times and taking the correct value from the majority of error-free bits. By comparison, the fragility of quantum states in qubits means that trying to directly copy them can have the counterproductive effect of changing the quantum state.
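The classical scheme described above is the repetition code: copy each bit several times, and recover it by majority vote even if one copy flips. A minimal sketch (function names are illustrative) of what qubits cannot do, since unknown quantum states cannot be copied:

```python
# Sketch of classical repetition-code error correction: store each bit
# as three copies, then recover it by majority vote. This is exactly
# the trick that fails for qubits, because an unknown quantum state
# cannot be directly copied without disturbing it.
from collections import Counter

def encode(bit, copies=3):
    """Repeat a bit so a single flip can be outvoted."""
    return [bit] * copies

def decode(codeword):
    """Recover the original bit from the majority of copies."""
    return Counter(codeword).most_common(1)[0][0]

codeword = encode(1)        # [1, 1, 1]
codeword[0] ^= 1            # a single bit-flip error: [0, 1, 1]
print(decode(codeword))     # majority vote still recovers 1
```

Quantum error-correcting codes reach a similar end by other means, spreading one logical qubit's information across several physical qubits and measuring only error syndromes, never the state itself.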
These real-time applications, according to Donna Dillenberger, a distinguished engineer at IBM’s Watson lab, can be done in a mainframe environment. They are not yet possible on clusters of smaller, industry-standard computers, she said. But there are several open-source software projects, like Apache Spark, that focus on real-time data processing across large numbers of computers.
He estimates the total cost of ownership including hardware, software and labor will be 50 percent less with a mainframe than on his “sprawling server farm,” given the growing complexity of managing hardware and software from several suppliers.
The plan calls for IBM to resell Apple devices with its software pre-installed. IBM activation, management and security software are also part of the deal. The partnership aims to give Apple the credibility it has not yet achieved in IT departments and to bring IBM into a popular mobile ecosystem.
IBM’s revenues are declining because there’s a big shift going on in the way companies are buying tech. Instead of buying their own software and hardware for their own data centers, then hiring expensive consultants to stitch it all together, they are renting that technology, which is often hosted elsewhere. That’s called “cloud computing.”
All the big tech firms are shifting from the old way of selling stuff to this new way with varying degrees of success: SAP, Oracle, Microsoft, Dell, HP and IBM are all getting into the cloud.
Google continues to top the search game with its mission of “organiz[ing] the world’s information and mak[ing] it universally accessible and useful.” But that mission now looks limited, given how rapidly artificial intelligence has pushed the boundaries of what’s possible and raised our expectations of computers; even Siri has. Seen in that light, Google is essentially a gigantic database with rich access and retrieval mechanisms, but without the ability to create new knowledge.
In other words: Google can retrieve, but Watson can create.
Once the oxide materials, which are innately insulating, are transformed into a conducting state, the IBM experiments showed that they maintain a stable metallic state even when power to the device is removed. This non-volatile property means that chips built on this novel phenomenon could store and transport data in a more efficient, event-driven manner, instead of requiring constant electrical currents to maintain the state of the devices.
Big data spans four dimensions: Volume, Velocity, Variety, and Veracity.
Big data is more than simply a matter of size; it is an opportunity to find insights in new and emerging types of data and content, to make your business more agile, and to answer questions that were previously considered beyond your reach.
The new work opens up the prospect of studying imperfections in the “wonder material” graphene or plotting where electrons go during chemical reactions.
The images are published in Science.
The experiments are carried out at a scale so small that room temperature would induce wiggling of the AFM’s constituent molecules that blurs the images, so the apparatus is kept at a chilly -268C.
While some improvements have been made since that first image of pentacene, the lead author of the Science study, Leo Gross, told BBC News that the new work was mostly down to a choice of subject.