In 1993/1994, at NASA’s Goddard Space Flight Center, Donald Becker and Thomas Sterling designed a commodity off-the-shelf (COTS) supercomputer: Beowulf. Since they couldn’t afford a traditional supercomputer, they built a cluster computer made up of 16 Intel 486 DX4 processors connected by channel-bonded Ethernet. This Beowulf supercomputer was an instant success.
Source: Linux totally dominates supercomputers | ZDNet
Linux first appeared on the Top500 in 1998. Before Linux took the lead, Unix was supercomputing’s top operating system. By 2003, the Top500 was well on its way to Linux domination, and by 2004 Linux had taken the lead for good.
In the area of supercomputing, Japan’s aim is to use ultra-fast calculations to accelerate advances in artificial intelligence (AI), such as “deep learning” technology that works off algorithms which mimic the human brain’s neural pathways, to help computers perform new tasks and analyze scores of data.
Source: Japan plans supercomputer to leap into technology future
The puzzle that required the 200-terabyte proof, called the Boolean Pythagorean triples problem, has eluded mathematicians for decades. In the 1980s, Graham offered a prize of US$100 for anyone who could solve it. (He duly presented the cheque to one of the three computer scientists, Marijn Heule of the University of Texas at Austin, earlier this month.) The problem asks whether it is possible to colour each positive integer either red or blue, so that no trio of integers a, b and c that satisfy Pythagoras’ famous equation a² + b² = c² are all the same colour. For example, for the Pythagorean triple 3, 4 and 5, if 3 and 5 were coloured blue, 4 would have to be red.
Source: Two-hundred-terabyte maths proof is largest ever
There are more than 10^2,300 ways to colour the integers up to 7,825, but the researchers took advantage of symmetries and several techniques from number theory to reduce the total number of possibilities that the computer had to check to just under 1 trillion. It took the team about 2 days running 800 processors in parallel on the University of Texas’s Stampede supercomputer to zip through all the possibilities. The researchers then verified the proof using another computer program.
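To make the problem concrete, here is a small sketch of the core check in Python. The function names and the brute-force approach are my own illustration; the actual proof encoded the problem for a SAT solver and used symmetry arguments, not direct enumeration.

```python
def pythagorean_triples(n):
    """Yield all triples (a, b, c) with a <= b < c <= n and a^2 + b^2 = c^2."""
    squares = {c * c: c for c in range(1, n + 1)}
    for a in range(1, n + 1):
        for b in range(a, n + 1):
            c = squares.get(a * a + b * b)
            if c is not None:
                yield (a, b, c)

def is_valid_colouring(colour, n):
    """colour maps each integer 1..n to 'red' or 'blue'.

    Return False if any Pythagorean triple up to n is all one colour.
    """
    for a, b, c in pythagorean_triples(n):
        if colour[a] == colour[b] == colour[c]:
            return False
    return True

# Up to n = 5 the only triple is (3, 4, 5), so any colouring that gives
# 4 a different colour from 3 and 5 is valid.
colour = {1: 'red', 2: 'red', 3: 'blue', 4: 'red', 5: 'blue'}
print(is_valid_colouring(colour, 5))  # True
```

The 2016 result showed that no such colouring exists once you reach 7,825: every red/blue assignment leaves at least one monochromatic triple, which is why checking it required a supercomputer rather than a loop like this one.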
Baseball data, over 95% of which has been created over the last five years, will continue to mount—leading MLB decision-makers to invest in more powerful analytics tools. While there are plenty of business intelligence and database options, teams are now looking to supercomputing—or at least, the spawn of HPC—to help them gain the competitive edge.
via Inside Major League Baseball’s “Hypothesis Machine”.
Please. The problem with current baseball analytics isn’t the deluge of data, it’s the deluge of crackpot theories that add more and more irrelevant variables to the mix. Most baseball analytics misuse mathematics and are created by people who are simply selling a website.
Speaking of selling a website: is this a good place to introduce the sister site to bucktownbell.com? 🙂
All the data in the above data model was crunched using Perl, awk, and bash on a standard PC. Baseball is not so complicated that it requires a supercomputer to crunch historical or current-season data. More from the article…
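The point about not needing a supercomputer can be made with a few lines of code. This is a minimal sketch in Python (the file layout, column names, and player stats here are all hypothetical, invented for illustration): computing batting averages from a per-season CSV, the kind of job a standard PC finishes instantly.

```python
import csv
from io import StringIO

# Stand-in for a file like 'batting.csv' with columns: player,hits,at_bats.
# Real season files are a few megabytes at most -- trivial on any PC.
data = StringIO("""player,hits,at_bats
PlayerA,185,456
PlayerB,165,451
""")

averages = {}
for row in csv.DictReader(data):
    at_bats = int(row["at_bats"])
    if at_bats > 0:
        averages[row["player"]] = round(int(row["hits"]) / at_bats, 3)

# Print players sorted by batting average, highest first.
for player, avg in sorted(averages.items(), key=lambda kv: -kv[1]):
    print(f"{player}: {avg:.3f}")
```

A century of historical data is small enough that even exhaustive recomputation of basic stats is a batch job measured in seconds, not a supercomputing workload.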
He explained that what teams, just like governments and drug development researchers, are looking for is a “hypothesis machine” that will allow them to integrate multiple, deep data wells and pose several questions against the same data.
The records were set using the ROSS (Rensselaer’s Optimistic Simulation System) simulation package developed by Carothers and his students, and using the Time Warp synchronization algorithm originally developed by Jefferson.
“The significance of this demonstration is that direct simulation of ‘planetary scale’ models is now, in principle at least, within reach,” Barnes said. “‘Planetary scale’ in the context of the joint team’s work means simulations large enough to represent all 7 billion people in the world or the entire Internet’s few billion hosts.”
via RPI: News & Events – Rensselaer Polytechnic Institute and Lawrence Livermore Scientists Set a New Simulation Speed Record on the Sequoia Supercomputer.
Maybe they can get SimCity modeled correctly.
Q. When do you expect Mira to be up and running?
A. Our hope is that we will receive the machine sometime in the third quarter of 2012 or early in 2013. Then it will take us three or four months to stand it up. It’s made up of 48 racks that weigh 2 tons each, so it takes a while to wheel it in, put it in place and wire it up.
via Science Connections: Argonne’s Superstar Supercomputer — Evanston news, photos and events — TribLocal.com.
Q. What are some cool projects being run on the supercomputer right now?
A. There is a current study on concrete that is pretty exciting. Concrete production is a $100 billion a year industry in the U.S. and generates a lot of carbon dioxide. A researcher from the National Institute of Standards and Technology is using the computer to figure out how to design better concrete that produces less out-gassing.
Ouliang Chang floated his lunar supercomputer idea a few weeks ago at a space conference in Pasadena, California. The plan is to bury a massive machine in a deep dark crater, on the side of the moon that’s facing away from Earth and all of its electromagnetic chatter. Nuclear-powered, it would process data for space missions and slingshot Earth’s Deep Space Network into a brand new moon-centric era.
via Why We Need a Supercomputer on the Moon | Wired Enterprise | Wired.com.
Clearly, the business of dreaming up supercomputers in space is not for those who think small.
The team hopes “Iridis-Pi” will inspire students, enabling them “to apply high-performance computing and data handling techniques to tackle complex engineering and scientific challenges.”
via Engineers Build Supercomputer Using Raspberry Pi, Lego – ParityNews.com: …Because Technology Matters.
Howto is here.
Steps to make a Raspberry Pi Supercomputer
The steps to make a Raspberry Pi supercomputer can be downloaded here: Raspberry Pi Supercomputer (PDF).
You can also follow the steps here Raspberry Pi Supercomputer (html).
The Top 10 Supercomputers, Illustrated, Nov. 2011 » Data Center Knowledge.
The twice-a-year list of the Top 500 supercomputers documents the most powerful systems on the planet. Many of these supercomputers are striking not just for their processing power, but for their design and appearance as well. Here’s a look at the top finishers in the latest Top 500 list, which was released Monday, November 15, 2011 at the SC11 conference in Seattle.
Supercomputer porn.