Showing posts with label processor.

Thursday, 29 August 2013

MIT develops 110-core processor for more power-efficient computing

The Massachusetts Institute of Technology has developed a 110-core chip as it looks for power-efficient ways to boost performance in mobile devices, PCs and servers.

The processor, called the Execution Migration Machine, explores ways to reduce traffic inside chips, which enables faster and more power-efficient computing, said Mieszko Lis, a Ph.D. candidate at MIT, during a presentation at the Hot Chips conference in California.


The chip is a general-purpose processor and not an accelerator like a graphics processor, Lis said, adding that it is experimental.

"It's not the kind of thing you buy for Christmas," Lis said.

Typically, a lot of data migration takes place between cores and their caches. The 110-core chip replaces those caches with a shared memory pool, which reduces the number of data transfer channels. The chip can also predict data movement trends, which reduces the number of cycles needed to transfer and process data.
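
To picture the idea, here is a rough toy model in Python (illustrative byte counts, not figures from the MIT design): when a thread's working data sits in another core's slice of the shared memory pool, a conventional chip copies cache lines back and forth for every remote access, while execution migration moves the small thread context to the core that owns the data.

# Toy model of on-chip traffic: data migration vs. execution migration.
# All sizes below are illustrative guesses, not figures from the MIT chip.
CACHE_LINE_BYTES = 64          # one cache line copied per remote access
THREAD_CONTEXT_BYTES = 256     # registers and program counter moved on migration

def data_migration_traffic(remote_accesses):
    """Conventional approach: every remote access pulls a cache line across the chip."""
    return remote_accesses * CACHE_LINE_BYTES

def execution_migration_traffic(access_runs):
    """Execution migration: move the thread context once per run of accesses
    that all touch the same remote core's slice of the shared memory pool."""
    return len(access_runs) * THREAD_CONTEXT_BYTES

# A thread makes 1,000 remote accesses, clustered into 10 runs of 100.
runs = [100] * 10
print("data migration:     ", data_migration_traffic(sum(runs)), "bytes")   # 64000
print("execution migration:", execution_migration_traffic(runs), "bytes")   # 2560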

The benefits of power-efficient data transfers could apply to mobile devices and databases, Lis said on the sidelines of the conference.

For example, data-traffic reduction will help mobile devices efficiently process applications like video, while saving power. It could also help reduce the amount of data sent by a mobile device over a network.

Fewer threads and predictive data behavior could help speed up databases. It could also free up shared resources for other tasks, Lis said.

The researchers have seen on-chip traffic reduced by as much as 14 times, which significantly cuts power dissipation. According to internal benchmarks, performance was 25 percent better than that of other processors, Lis said, though he did not specify which processors were used for comparison.

The chip has a mesh architecture, with the 110 cores interconnected in a square layout. It is based on a custom architecture designed to deal with large data sets and to make data migration easier, Lis said. The code was also written specially to work with the processor.
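
On a mesh like this, a message travels from core to core through neighboring cores, so the cost of reaching another core is roughly the Manhattan distance between their grid positions. The short Python sketch below illustrates the point with an assumed 10-by-11 grid; the article does not give the chip's actual layout or routing.

# Hop count on a 2-D mesh interconnect (illustrative only; the real
# 110-core layout and routing scheme are not described in the article).
GRID_COLS = 10
GRID_ROWS = 11   # 10 x 11 = 110 cores, an assumed arrangement

def core_position(core_id):
    """Map a core index (0..109) to a (row, column) position on the mesh."""
    return divmod(core_id, GRID_COLS)

def mesh_hops(src, dst):
    """Minimum hops between two cores with simple X-Y (dimension-ordered) routing."""
    (r1, c1), (r2, c2) = core_position(src), core_position(dst)
    return abs(r1 - r2) + abs(c1 - c2)

print(mesh_hops(0, 109))   # opposite corners: 19 hops
print(mesh_hops(54, 55))   # neighboring cores: 1 hop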

Top chip makers have moved away from simply adding cores, topping out at between 12 and 16 cores per processor. But the MIT researchers crammed 110 cores into a chip measuring 10 millimeters by 10 millimeters, Lis said. The chip was made using a 45-nanometer process.

The mesh architecture is also used in chips from Tilera, which can scale up to 100 cores. But Lis said the 110-core chip is not based on Tilera's architecture, nor is it a successor.

Agam Shah covers PCs, tablets, servers, chips and semiconductors for IDG News Service. Follow Agam on Twitter at @agamsh. Agam's e-mail address is agam_shah@idg.com.

Correction: This story as originally posted misnamed the processor. The article has been amended.




Monday, 26 August 2013

Intel expands custom processor business

Intel is putting a sharp focus on expanding its custom processor and chip operations in response to a growing trend of companies building servers in-house to meet specific workloads or data center designs.

Over the last year, Intel has supplied custom processors to 18 companies, most notably eBay and Facebook, said Diane Bryant, senior vice president and general manager of Intel's Datacenter and Connected Systems Group, in an interview.

"That trend is growing. In the last year we've delivered 18 custom silicon processor solutions for the full array of customers -- our direct customers, the OEMS and the end users -- in order to meet their specific needs," said Bryant, who runs Intel's most profitable group.

The growing custom processor business will supplement Intel's bread-and-butter server chip business: selling generic Xeon processors for the rack, tower and blade servers sold by companies like Dell and Hewlett-Packard. But server infrastructures are changing with the growing adoption of cloud computing, big data and other applications, which has translated into growing demand for custom processors, Bryant said.

Companies with mega data centers, like Facebook and Google, design servers in-house and have them built by direct server suppliers like Quanta. The barebones servers typically cut out excess components and are good enough to handle growing cloud workloads such as search requests and social networking tasks. Facebook and Google have experimented with ARM processors in servers, and Tilera processors have also been tested.

"When you work with these end users who have technology as their core business, they are very clear on what is required," Bryant said. "They know what their workloads are, what their various applications are, they know what metric they are looking to hit from a performance per total cost of ownership."

The level of processor and chip customization varies with the workload, data center design, and even cooling solutions. Bryant provided an example where a flexible cooling system in a data center would allow customers to run processors at a higher frequency.

"We will have customers that have a very [specific] power target, so we will create versions whether its through changes in frequency, changes in core count, changes to drive down the power," Bryant said.

Customers usually provide information about the applications they are running, the accelerators they need, and the performance and power consumption levels they are looking to hit. Intel then customizes processors and chips to meet those specifications. Some customers in the technology and data center business get specific about the server infrastructure as well.

"It all boils down to scale. I had one cloud service provider who had told me a single application is running across tens of thousands of servers. You can afford to tune that server very targeted against that application and eke out every bit of performance at ever lower cost of operations," Bryant said.

Intel is also now able to build system-on-chip (SoC) designs, in which the CPU is combined with accelerators, I/O, graphics and other processing units. That makes it easier to build custom processors and chips, Bryant said.

"With our SoC capability now, we can actually do rapid turns of our base product with very unique accelerators. Whether it's voice recognition acceleration or encryption or graphics acceleration... all the different types of accelerators that are targeted at different apps. We can deliver unique products there too," Bryant said.

Intel next year will release Xeon server chips based on the Broadwell processor core, which will succeed Haswell. Bryant said the server SoC will also help optimize the chip for workloads, be it analytics or cloud.

"We have this wonderful Xeon core, and now Intel has a system-on-a-chip capability where we can rapidly turn out grabbing different intellectual property blocks and accelerators. Why not take this Xeon core and marry it with the SoC capability, and come up with... very [specific] processors targeted at unique capabilities," Bryant said.

In some ways, Intel is taking the same route as Advanced Micro Devices, which is creating custom chips based on its CPU and graphics architectures, but largely for non-server products. AMD's custom chips will be used in the upcoming Sony PlayStation 4 and Microsoft Xbox One gaming consoles.

Intel is also investing in software development to tie applications directly to chip development. The chip maker has released its own version of Hadoop, and is also actively contributing an orchestration layer to OpenStack so resources are effectively allocated at server, storage and network levels in distributed computing environments.

Beyond the server, Intel is also looking to change data center design. One project, called Rack Scale, aims to decouple the processing, I/O and storage units in data centers and connect them with faster throughput mechanisms.

"Instead of a rack being 24 servers slotted in, with each of those servers with compute, memory and I/O, instead break that artificial barrier of the server down and look at it at the rack level. And create pools of compute, memory and I/O so that the application can access and use whatever capacity it needs," Bryant said.

The company is expected to announce a new optical throughput standard called MXC, which will be detailed at the Intel Developer Forum next month. The company is also developing processors for different target markets, Bryant said. Intel will announce a new Atom processor called Rangeley for embedded networking devices in early September, ahead of IDF.

"We have hundreds of microprocessor products to cover the entire space," Bryant said.

Agam Shah covers PCs, tablets, servers, chips and semiconductors for IDG News Service. Follow Agam on Twitter at @agamsh. Agam's e-mail address is agam_shah@idg.com



Friday, 16 August 2013

Improve your prose with a word processor cliché detector

Darthdusty asked the Other Software & Services forum about software that will "detect and alert" him when he types certain phrases.

It sounds like you need a cliché detector--software that can catch a word, a phrase, or even a piece of punctuation that you use too often. This can be a valuable tool.

If your word processor has an Autocorrect feature, you can use it to catch certain words or phrases. A number of word processors, including the ones in LibreOffice and Google Docs, have Autocorrect. The specific instructions below are for Microsoft Word 2010 and 2013, but the general idea will work in other programs.


Autocorrect is intended to fix common typos and misspellings. For instance, if you type thier, Autocorrect changes it to their. But it has other uses--such as shortcuts that make typing easier. Because I write a lot about online issues, I've created an Autocorrect entry that replaces int with Internet--I type three letters and get eight.

By the way, the replacement happens when you hit the spacebar, the Enter key, or any punctuation, indicating that you've finished a word. That way, I can type interest without it coming out Interneterest.

To add your own Autocorrect entries in Word, click the File tab, then Options in the left panel. In the resulting Word Options dialog box, click Proofing in the left pane. Then click the AutoCorrect Options button.

Now it's time to set up your warning.

Click the Replace field and type the phrase you want to be warned against--for instance, last but not least. Then, in the With field, enter something that will catch your eye and identify the overused phrase, such as !!!!!!!!!!!!!!!!LAST BUT NOT LEAST!!!!!!!!!!!!!!!!!!. Click Add, then OK.

And if, on some occasion, you actually want to type "last but not least," type it, and once Word changes the phrase to your eye-catcher, press Ctrl-Z to undo the automated change.
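
If you'd rather run the check outside your word processor, the same idea fits in a short script. The Python sketch below is a minimal, hypothetical example--edit the phrase list to match your own habits--that scans a text file and prints the line number of every cliché it finds.

# Minimal cliché detector: report where listed phrases appear in a text file.
# The phrase list is just an example; edit it to match your own tics.
import re
import sys

CLICHES = [
    "last but not least",
    "at the end of the day",
    "needless to say",
]

def find_cliches(path):
    pattern = re.compile("|".join(re.escape(p) for p in CLICHES), re.IGNORECASE)
    with open(path, encoding="utf-8") as f:
        for line_no, line in enumerate(f, start=1):
            for match in pattern.finditer(line):
                print(f"{path}:{line_no}: {match.group(0)!r}")

if __name__ == "__main__":
    find_cliches(sys.argv[1])   # usage: python cliche_detector.py draft.txt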

Szczecinianin recommended another solution--a program called Repetition Detector--in the original forum discussion. I haven't looked at it.

When he isn't bicycling, prowling used bookstores, or watching movies, PC World Contributing Editor Lincoln Spector writes about technology and cinema.
More by Lincoln Spector

