Researchers Go Molecular in Design of a Denser Chip
New York Times (01/25/07) P. A16; Chang, Kenneth
In an achievement that could pave the way for Moore's law to be extended
beyond the next 10 or 20 years, researchers at the California Institute of
Technology and the University of California, Los Angeles, have developed
the densest memory chip ever built. The chip measures about one-2,000th of
an inch on each side, about the size of a white blood cell, and can hold
160,000 bits of information. Although the chip is far from deployable, the
process by which it was created could potentially be scaled up to a
realistic manufacturing process. "Our goal always was to develop a
manufacturing technique that works at the molecular scale," said Caltech
chemistry professor James R. Heath. "It's a scientific demonstration, but
it's a sort of a stake in the ground." The density of bits on the chip is
greater than that of today's chips by a factor of 40, and Heath believes
that enhancements in this technique could lead to an additional increase in
density by a factor of 10. The chip's molecular switches, belonging to the
rotaxanes class of molecules, are shaped like dumbbells with rings that
slide between the two ends of the bar, representing ones and zeros. The
researchers etched 400 parallel wires, less than a millionth of an inch
wide each, deposited a layer of vertically oriented molecular switches on
top of them, and then placed another layer of 400 wires on top of this
layer of switches, turned 90 degrees from the bottom layer of wires. Each
bit of information is stored at a crossing point between two perpendicular
wires, where about 100 switches are contained. When the chip was tested,
only 30 percent of the bits could be made to work, and the switches broke
after being flipped 10 times, but the chip could still be used to read and
write information. As the project is meant simply as a demonstration,
"We're just happy it works," says Heath.
Bush Wants H-1B Visa Cap Hike
Computerworld (01/25/07) Thibodeau, Patrick
In a speech to DuPont employees this week, President Bush called for an
increase in the federal cap on H-1B visas and expressed his desire to work
with Congress on the matter. "I understand that we need to make sure that
when a smart person from overseas wants to come and work in DuPont, it's in
our interests to allow him or her to do so," said Bush. In 2006, he asked
Congress for an increase in the number of H-1Bs available each year, but
the measure did not succeed. Material prepared by the White House as a
supplement to the recent State of the Union Address, which did not mention
H-1Bs, said, "Such a program will serve the needs of our economy by
providing a lawful and fair way to match willing employers with willing
foreign workers to fill jobs that Americans have not taken." In response
to this statement, IEEE VP Ron Hira said the IEEE "wholeheartedly
endorse[s] this principle. But the H-1B program does not meet it." As the
H-1B system stands, "employers do not have to search for Americans, and can
prefer an H-1B [visa holder] over an American citizen or green card
holder," added Hira. Congress is expected to debate legislation to
increase the H-1B cap, but it is uncertain whether any action will be
taken.
U.S. Firms Poised to Ramp Up R&D Spending
Wall Street Journal (01/25/07) P. A20; Naik, Gautam
The U.S. will see increases in both private investment and government
spending on R&D this year, maintaining the nation's international lead,
although changes may be needed to preserve this lead into the future,
concludes a new report from Battelle Memorial Institute and R&D Magazine.
The report says that U.S. companies are expected to spend about $219
billion on R&D in 2007, a 3.4 percent increase from last year's $212
billion, and federal contribution to R&D is expected to be around $98.3
billion, a 1.8 percent increase from last year's $96.6 billion.
Electronics, biotechnology, pharmaceuticals, software, semiconductors, and
aerospace are expected to receive the most attention. "The U.S. is going
through a period where it's beginning to recover" from the 1 percent or 2
percent annual R&D growth it had been witnessing since the early 1990s,
says Battelle senior analyst Jules Duga, who expects annual growth rates
to rise to 3 percent or 4 percent over the "next few years." The U.S.
leads all other countries in R&D investment, with 64 percent more spending
in the sector than second-place China, but the aging of U.S. scientists
and engineers, the lack of government spending on schools, and increasing
foreign competition will most likely challenge U.S. dominance.
Recent years have seen large portions of U.S. R&D money go to the
development of high-tech anti-terror tools, and the coming years are
expected to see the energy sector receive a growing portion of R&D money.
China, which is spending a great deal on education, has increased R&D
spending at an annual rate of 17 percent. In 2007, Battelle predicts that
5 percent of U.S. industrial R&D will be outsourced. "We'll need a mental
change to acknowledge that unlike 20 years ago, we're not going to be
dominant in every field," Duga says.
State of the Union Speech Light on Tech
InternetNews.com (01/24/07) Mark, Roy
Although President Bush's State of the Union Address this week did not
focus on any specific technological issues, he expressed the importance of
technology to several concerns. Improved information technology, Bush
said, is needed to "reduce costs and medical error" in health care; he
called technology "the way forward" in exploring alternative sources of
energy; spoke of the need for "new infrastructure and technology" to
improve border control; and said "We can make sure our children are
prepared for the jobs of the future and our country is more competitive by
strengthening math and science skills." Bush also mentioned the need for
Congress to have "a serious, civil, and conclusive debate" on immigration
reform, although he made no mention of H-1B visas, of which all 65,000
allotted for 2007 have already been used. Also left out of the speech were
the previously called-for "universal, affordable access for broadband,"
government funding of research, and R&D tax credit reform to encourage
private R&D investment. However, the President's positive attitude toward
technology was well received by industry groups. The Business Software
Alliance's Karen Knutson says, "Technology is such an underpinning for
everything he talked about. There were some pretty good things in there,
particularly about the need to focus on math and science education."
Several other groups praised Bush's acknowledgement of the role technology
plays in energy concerns. "This strong commitment will continue the
bipartisan progress we are making to provide greater energy security,
global competitiveness and enhanced protection for our environment," said
TechNet CEO Lezlee Westine.
Questions Remain in D-13 Undervote Controversy
Bradenton Herald (FL) (01/25/07) Marsteller, Duane
A recent study by electronic voting experts has concluded that the cause
of the undervote in Florida's 13th Congressional District election may not
be found without a thorough investigation of the voting machines involved.
The race's location on the ballot could be partially to blame, although
machine failure is still a possibility, according to Cornell professor of
government Walter Mebane, who conducted the study with Stanford University
computer science professor David Dill. The report also says that
"personalized electronic ballot" cartridges used to operate the machines
could have caused the undervote, and that an above-average frequency of
"invalid vote" error messages was found on machines that showed high
undervote rates, suggesting the possibility of malfunction. "The situation
is complicated," Mebane says. "It's hard to say, 'Here's the one thing
that's the origin of the undervotes.'" Democrat Christine Jennings, who
lost by only 369 votes although 18,000 ballots had no vote for the race,
believes the report proves her stance that the situation can only be
resolved by an independent study of the machines' source code. However,
Mebane and Dill said that even such a study might not provide conclusive
answers. "It is an unfortunate fact that no feasible amount of testing or
examination of modern computer systems can rule out machine malfunctions,
which are often subtle and unreproducible," the report says. For
information on ACM's e-voting activities, visit
http://www.acm.org/usacm
College of Computing Awarded NSF Grant to Broaden CS Pipeline
Georgia Institute of Technology (01/23/07)
The National Science Foundation has awarded a $2 million grant to the
College of Computing at Georgia Tech to develop programs that will help get
more students involved in computer science. The grant from the NSF's
Broadening Participation in Computing Initiative targets students at every
educational level, as well as from historically underrepresented groups,
with hopes of attracting more undergraduate and graduate students to
computer science programs. The programs that Georgia Tech creates will
serve as a model for developing larger programs across the country. They
could include partnerships with youth organizations at the state and local
level, mentorship programs for college students, workshops to improve the
educational approach of computer science programs, assistance with
curriculum ideas, and better communication of the results of new methods
to programs interested in changing their curricula. "In
anticipation of the expansive and extensive impact that technology will
continue to have on our culture and society, it is imperative that
educators engage a broader base of potential computer science students,
particularly women and minorities, through more contextualized and
appealing methods and practices," says Georgia Tech professor Mark Guzdial.
"With this grant, the College of Computing at Georgia Tech has an
excellent opportunity to integrate a new and highly-creative approach to
computer science education across the learning spectrum--from kindergarten
to college, and beyond."
Some Vista, Office Innovations Spring From MS Research
eWeek (01/24/07) Galli, Peter
Microsoft Research played a key role in the creation of several Vista and
Office 2007 features, including Vista's desktop search, Sidebar, and
SuperFetch, as well as Office's ribbon-based user interface. Susan Dumais,
principal researcher of Microsoft Research's Adaptive Systems and
Interaction Group, developed a prototype for Vista's desktop search
feature, which was distributed to 3,000 Microsoft employees, allowing
Dumais and her team to understand the challenges associated with the
information retrieval project, "such as how users know a lot of information
about what they are searching for and often remember specific
characteristics," she says. Sidebar, which displays customizable "gadgets"
on the desktop, began as a Microsoft Research prototype called Sideshow.
Vista's SuperFetch, which learns which applications a user opens most
frequently and in what order so that it can ready them for use, grew out
of the goal of Eric Horvitz, principal researcher and research area
manager for Microsoft Research's Adaptive Systems and Interaction group,
of making operating systems understand users as well as context. Horvitz says
that machine learning allows SuperFetch to have "encoded, deep within the
system, a sense for the frustration that people may feel when they wait for
a response." Office's ribbon interface was the product of the Human
Centered Computing group's and the Visualization and Interaction Research
Group's meeting with user researchers to simplify the Office interface. An
analysis of volunteer users into what features were used most often and
what features were used in conjunction others provided the groups with the
information needed to create the ribbon.
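The SuperFetch idea, observing launch patterns and readying the likely next application, can be made concrete with a toy predictor. The sketch below is invented for illustration and is not Microsoft's implementation (which, per the article, relies on machine learning); it simply counts which application tends to follow which and "prefetches" the most frequent successor.

    # Toy launch predictor (illustration only, not SuperFetch itself)
    from collections import defaultdict, Counter

    class LaunchPredictor:
        def __init__(self):
            # transitions[a][b]: how often app b was opened right after app a
            self.transitions = defaultdict(Counter)
            self.last_app = None

        def record_launch(self, app):
            if self.last_app is not None:
                self.transitions[self.last_app][app] += 1
            self.last_app = app

        def predict_next(self):
            """Return the app most often launched after the current one."""
            if self.last_app is None or not self.transitions[self.last_app]:
                return None
            return self.transitions[self.last_app].most_common(1)[0][0]

    p = LaunchPredictor()
    for app in ["mail", "browser", "mail", "browser", "editor", "mail"]:
        p.record_launch(app)
    print(p.predict_next())  # "browser" -- worth readying before it is requested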
Sentimental Journey
CIO (01/23/07) Schindler, Esther
Businesses are beginning to see value in software that can understand and
respond to human emotion. Programs that understand the way a user is
feeling could help that user make better decisions, and others could
improve customer service methods. Dr. Marc Schroder of the W3C Emotion
Incubator Group believes that if computing does not become more "natural,"
the average user will no longer be able to handle increasingly intricate
interaction with a machine; he points to the natural cues we pick up just
by looking at someone's face before they say anything. Software has been
developed that can detect emotion in a human voice or in text, and
businesses use these systems, for example, to gauge levels of emotion or
to choose appropriate wording. The Human-Machine-Interaction Network on
Emotion (HUMAINE)
research community aims to establish the groundwork for software that can
understand, replicate, or influence human emotion, called "emotion-oriented
systems." Schroder says the group focuses on "input, reasoning, and output
of a run-time system, and also annotation of recordings of human behavior,
which can be analyzed, e.g. by machine-learning algorithms." The group
expects to produce an XML-based standard representation format. Several
research projects are developing consumer devices that make emotion part of
the information being transmitted, through text messages whose color
changes based on the sender's emotional state that is picked up by
biosensors and other sources, for example. IBM Almaden Research Center
manager of cognitive computing Dharmendra Modha says that data management
has dealt with structured data in the past, but as the processing of
unstructured data such as human emotion becomes a reality, the goal will
be for computer science to put together the architecture of the mind.
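As a deliberately tiny illustration of one capability mentioned above, detecting emotion in text, the sketch below scores words against a hand-made lexicon. Every word, label, and weight here is invented for the example; real emotion-oriented systems of the kind HUMAINE studies use far richer models of voice, face, and context.

    # Minimal lexicon-based emotion scorer (invented lexicon, illustration only)
    EMOTION_LEXICON = {
        "happy": ("joy", 1.0), "great": ("joy", 0.8),
        "angry": ("anger", 1.0), "terrible": ("anger", 0.7),
        "worried": ("fear", 0.8), "afraid": ("fear", 1.0),
    }

    def score_emotions(text):
        """Sum lexicon weights per emotion over the words in `text`."""
        scores = {}
        for word in text.lower().split():
            word = word.strip(".,!?")
            if word in EMOTION_LEXICON:
                emotion, weight = EMOTION_LEXICON[word]
                scores[emotion] = scores.get(emotion, 0.0) + weight
        return scores

    print(score_emotions("I am worried and a little angry about this order!"))
    # {'fear': 0.8, 'anger': 1.0}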
Novel Computed Imaging Technique Uses Blurry Images to Enhance View
University of Illinois at Urbana-Champaign (01/22/07) Kloeppel, James E.
University of Illinois researchers have established a computational
image-forming method for optical microscopy that is able to quickly
generate clear 3D images from data that is out of focus. The method, known
as Interferometric Synthetic Aperture Microscopy (ISAM), is expected to do
for optical microscopy what magnetic resonance imaging did for nuclear
magnetic resonance. "ISAM can perform high-speed, micron-scale,
cross-sectional imaging without the need for time-consuming processing,
sectioning and staining of resected tissue," said ISAM paper author
Stephen Boppart, a professor of electrical and computer engineering, of
bioengineering, and of medicine at UI. Drawing on an understanding of the
physics of light scattering in a sample, the technique uses a
broad-spectrum light source as well as a spectral interferometer to produce
crisp, reconstructed images. Boppart says ISAM has the potential to help
the field of cell and tumor biology and to allow for imaging to replace
biopsy in clinical diagnosis. While 3D optical microscopy techniques used
today need the instrument's focal plane to be scanned through the area
being examined, ISAM unscrambles the light from out-of-focus image planes,
expanding the region of the image that is in focus to provide a
high-resolution image. "We have demonstrated that the discarded
information can be computationally reconstructed to quickly create the
desired image," said UI research scientist Daniel Marks. In the future,
ISAM could allow micron-scale imaging for large amounts of tissue.
It Just Comes Naturally
San Francisco Chronicle (01/25/07) P. C1; Sarkar, Pia
Despite the small percentage of jobs in the technology industry held by
African Americans, and the even smaller percentage for executive positions
in the industry, Michael Fields, CEO of Kana, a software company, remains
dedicated to seeing greater diversity in the sector. An African American
himself, Fields has been president of Oracle's domestic operations and
started his own software company, OpenVision. He has purchased an office
building to dedicate to female- and minority-run businesses, set up a
training program at the College of Alameda, and is currently working to
provide scholarships to students in the Virgin Islands who are interested
in technology. According to the 2000 U.S. Census, African Americans make up
12 percent of the population but only 6.7 percent of computer science
professionals, 3.9 percent of engineers, and 7.5 percent of engineering and
science technicians, and these numbers are an increase from the 1990
census. Wayne Hicks, a former president of Black Data Processing
Associates, which
is dedicated to bringing more African Americans into technology, blames
these percentages on low graduation rates and lack of exposure to
technology jobs. Both Hicks and Fields feel that given the small number of
African Americans in the industry, young African Americans do not have much
of a network to tap into, but as more enter the industry, the youth will be
given greater opportunities. Hicks points out that as technology changes,
and as more opportunities arise in more fields, "Technology is [a] piece of
business instead of off to the side, which means there's lots of
opportunity. It's not just strictly, 'Can you write a program?'" While
Hicks does acknowledge racism, he says it is no more of a barrier than in
any other field.
'Sniffer-Bot' Algorithm Helps Robot Seek Scents
New Scientist (01/24/07) Inman, Mason
Researchers in France have developed an algorithm that could allow robots
to find the source of a faint scent even in the midst of air turbulence,
much like a moth does. Massimo Vergassola and some of his colleagues at
the Pasteur Institute in Paris tested their simple algorithm in a virtual
environment and found that it not only allowed a virtual robot to
successfully track and find the source of a scent, but it caused the
virtual robot to move in complex back-and-forth sweeping motions, S-curves,
and spirals that closely resemble the way a moth tracks a scent. The
algorithm uses information received from the scent itself as well as
information received when the scent is not detected, striking a balance
between heading directly toward the point where it guesses the scent is
coming from and wandering around collecting information but not making any
progress toward the source. Vergassola says the algorithm could be
implemented in an actual robot or be used for other applications that
involve searching without much information, such as detecting the best
paths for information to be sent through a network. This research
"provides a new framework for understanding a large and significant class
of problems encountered in real world situations," says the University of
Pennsylvania's Alan Gelperin. He adds that the algorithm could be improved
further by equipping a robot with instruments that gather information
about the surrounding airflow.
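The balance the article describes, heading for the best current guess versus moving to gather more information, can be sketched in code. The following is a heavily simplified illustration of that idea, not the published algorithm: the searcher keeps a probability map over possible source locations, applies a Bayes update after every detection or non-detection, and steps to the neighboring cell that most reduces the map's expected entropy. The exponential detection model is an invented stand-in for real odor-plume physics.

    # Simplified entropy-guided scent search (illustration only)
    import math
    import random

    SIZE = 20  # search grid is SIZE x SIZE

    def p_detect(pos, src):
        # assumed model: detection odds decay with Manhattan distance
        d = abs(pos[0] - src[0]) + abs(pos[1] - src[1])
        return math.exp(-d / 3.0)

    def entropy(belief):
        return -sum(p * math.log(p) for row in belief for p in row if p > 0)

    def bayes_update(belief, pos, detected):
        """Reweight the source-location map by the observation likelihood."""
        new = [[0.0] * SIZE for _ in range(SIZE)]
        total = 0.0
        for i in range(SIZE):
            for j in range(SIZE):
                like = p_detect(pos, (i, j))
                if not detected:
                    like = 1.0 - like
                new[i][j] = belief[i][j] * like
                total += new[i][j]
        return [[v / total for v in row] for row in new]

    def best_move(belief, pos):
        """Step to the neighbor with the lowest expected posterior entropy."""
        candidates = []
        for di, dj in ((0, 1), (0, -1), (1, 0), (-1, 0)):
            ni, nj = pos[0] + di, pos[1] + dj
            if not (0 <= ni < SIZE and 0 <= nj < SIZE):
                continue
            # chance of a detection at the candidate cell under current belief
            p_hit = sum(belief[i][j] * p_detect((ni, nj), (i, j))
                        for i in range(SIZE) for j in range(SIZE))
            exp_h = (p_hit * entropy(bayes_update(belief, (ni, nj), True)) +
                     (1 - p_hit) * entropy(bayes_update(belief, (ni, nj), False)))
            candidates.append((exp_h, (ni, nj)))
        return min(candidates)[1]

    random.seed(1)
    source, pos = (15, 4), (2, 17)
    belief = [[1.0 / SIZE ** 2] * SIZE for _ in range(SIZE)]
    for step in range(80):
        if pos == source:
            print("reached the source in", step, "steps")
            break
        belief = bayes_update(belief, pos, random.random() < p_detect(pos, source))
        pos = best_move(belief, pos)
    else:
        print("stopped at", pos, "; source at", source)

In simulations of rules like this one, the path tends to alternate between direct pursuit and sweeping excursions, qualitatively like the moth-style trajectories described above, because stepping straight toward the likeliest cell is rarely the most informative move.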
Tanenbaum Outlines His Vision for a Grandma-Proof OS
Computerworld Australia (01/24/07) Dahdah, Howard
Computers should have a lifetime failure rate of zero, just like other
electrical appliances such as TVs and stereos, according to operating
systems expert Dr. Andrew Tanenbaum, a professor of computer science at
the Vrije Universiteit in the Netherlands. Speaking at linux.conf.au last week,
Tanenbaum said operating system software would need to be smaller to
improve the reliability of today's software. Adding unnecessary features
only makes software slower and buggier, he stressed, noting that hardware
such as RAID arrays and ECC memory can already correct errors on the fly.
"So I think we need to go in the direction of self-healing software,"
explained Tanenbaum. The creator of the MINIX 3 operating system,
Tanenbaum said the code in the OS kernel should be small and modular, and
he suggested that components such as drivers and file systems should be
isolated to prevent any problems from spreading. MINIX makes use of many
of the features Tanenbaum discussed, and Linux was originally developed
under it. "Maybe the direction Linux could go would be [as] the system that
is ultra reliable, that works all the time and has not got all the problems
that you get in Windows," Tanenbaum noted.
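The self-healing pattern Tanenbaum advocates can be caricatured in a few lines. The sketch below is an invented illustration, not MINIX code: a flaky "driver" runs as an isolated process, and a supervisor restarts it whenever it dies, so one component's crash cannot take the rest of the system with it.

    # Toy "self-healing" supervisor (illustration only, not MINIX code)
    import multiprocessing as mp
    import random
    import time

    def flaky_driver():
        """Stand-in for an isolated driver that occasionally crashes."""
        while True:
            time.sleep(0.1)
            if random.random() < 0.3:
                raise RuntimeError("simulated driver fault")

    def supervise(target, max_restarts=3):
        """Restart the driver each time its process dies, up to a limit."""
        for attempt in range(1, max_restarts + 1):
            proc = mp.Process(target=target)
            proc.start()
            proc.join()  # returns only when the driver process has died
            print("driver exited with code", proc.exitcode, "- restart", attempt)
        print("driver keeps failing; escalating instead of restarting again")

    if __name__ == "__main__":
        supervise(flaky_driver)

The essential point is the process boundary: because the driver cannot touch the supervisor's memory, its crash is an event to react to rather than a system failure.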
Report: Among Tech Execs, Men Face Gender Wage Gap
eWeek (01/24/07) Perelman, Deborah
The average difference between the wages of men and women in technology
jobs declined from 10.9 percent in 2005 to 9.7 percent last year, and some
female professionals now earn more than their male counterparts for some
job titles, according to a new report from Dice, a career site for IT and
engineering professionals. Female help desk professionals made 4.8 percent
more than males by averaging $40,937, female technical writers made 2.5
percent more at $73,816, and female IT executives (CEOs, CIOs, chief
technology officers, vice presidents, and directors) made 1.4 percent more
at $109,912. Women between the ages of 18 and 24 earned about as much as
men of the same age ($41,700 compared to $41,722), but the wage gap
increased to 7.6 percent between 25 and 29 years of age, and to at least 10
percent for all age groups over 30. Overall, IT salaries rose 5.2 percent
as the average increased from $69,700 to $73,308. The greatest gains were
enjoyed by professionals who specialize in enterprise resource planning
($96,161), Sarbanes-Oxley compliance ($91,998), and customer relationship
management ($90,499). Entry-level salaries rose 13.1 percent to $42,414 as
employers sought to attract more workers to the industry. The best pay was
found in Silicon Valley ($90,310), followed by Boston ($80,308), New York
($80,006), and Baltimore/Washington, D.C. ($79,911).
The Open Source Initiative Still Lives
Linux-Watch (01/23/07) Vaughan-Nichols, Steven J.
Despite recent silence, the Open Source Initiative (OSI) has been rather
active, evaluating open source licenses and platforms, and regaining its
role as an open source authority. Begun in 1998, OSI had the final word on
what was truly "open source," but in the following years the group began to see
new licenses, such as ungoverned "vanity licenses" that companies would
grant themselves, and Mozilla Public Licenses (MPLs), as hindrances to open
source development. In the opinion of many, OSI must approve a license for
it to be considered a true open source license. Additionally, "badgeware"
attached to MPL licenses, which prominently displayed the logo of the open
source company responsible for the code being used, caused additional
criticism of these licenses. OSI has never stopped meeting, and its work
has recently begun to show results. "Somewhere around a half-dozen licenses have been
withdrawn. The job isn't done--the board still has some policy decisions
to make," says OSI co-founder and former leader Eric S. Raymond, who left
for a while but has returned to the board. As signs of success, new
members have been added to the board, and Intel, Sun, and several other
companies have dropped their own OSS licenses. OSI promises to be more
open in its future proceedings, citing difficulty in the deployment of its
new Drupal-based Web site, which will include minutes from board meetings.
"Open source is a big buzzword again now, and yes there are those who are
trying to understand how they can embroider over the edges of open source
to achieve business goals nearly but perhaps not perfectly aligned with the
spirit of the Open Source Definition," says OSI secretary/treasurer and
Intel senior director of open source strategy Danese Cooper. She
encourages those who have disputes with OSI to join the current dialog.
'Storm Worm' Trojan Horse Surges On
CNet (01/22/07) Espiner, Tom
Security firms are dismayed by the aggressive Trojan horse that was
unleashed on computers around the world last weekend. They do not know who
is behind the attack or where it was launched from, and they are still
trying to determine the extent of the botnet associated with the Trojan "Storm
Worm." According to antivirus vendor F-Secure, Storm Worm started Friday
as an email about storms in Europe, which sought to get recipients to
download an executable file to read the news story, and six subsequent
attacks over the weekend similarly tried to woo readers with news such as a
missile test by China or the death of Fidel Castro. F-Secure adds that
each version of the emails was capable of updating itself, which prevented
most antivirus programs from detecting it. "The bad guys are putting a lot
of effort into it--they were putting out updates hour after hour," says
Mikko Hypponen, director of antivirus research at F-Secure. The
compromised machines, possibly hundreds of thousands of home computers,
were turned into zombie machines for a botnet that acted like a
peer-to-peer network, in that it has no centralized control. Attackers
tend to control botnets through a central server, which can be located,
ultimately allowing zombie networks to be taken down and destroyed.
Even Experts Have Difficulty Understanding Web 3.0
Daily Yomiuri (Japan) (01/23/07) P. 18; Jerney, John
Though there has been a considerable amount of talk about Web 3.0 and what
it will actually entail, there has been little agreement. Many would say
that Web 3.0 will bring meaning to the content and the links of the
Internet, by incorporating a certain level of artificial intelligence,
which has been called the Semantic Web; rather than searching for
information, users could ask the Semantic Web specific technical questions
and get concise answers without being forced to look through hundreds of
pages. Volksware President John Jerney believes such a vision of the
Semantic Web is unrealistic, as it would necessitate a "vast change in the
way we store information on the Web" and "a quantum leap in our
understanding of cognition and reasoning." Instead, Jerney proposes an
idea he calls the Executable Web, wherein normal people could "aggregate,
process, and visually interpret information from divergent sources with the
same level of ease as they start a blog today." This "read-write" Web
would be built on currently available technology, whereas the Semantic Web
is based on technology that is still in various stages of the research
phase. In order for the Executable Web to become a reality, a user module
and interface are needed to enable the foundational infrastructure to make
sense to the everyday user. Jerney points out that given the proliferation
of mobile devices, network-hosted applications make perfect sense, as they
could lead to the combination and simplification of mobile devices needed
to "leverage the power of the Web."
Preserving Printed and Digital Heritage
BBC News (01/22/07) Geist, Michael
Governments should take a proactive role in the preservation of printed
and digital publications through the construction of libraries, and
University of Ottawa Internet law professor Michael Geist offers several
suggestions for achieving this. The Canadian government has revised its
legal deposit regulations to account for online publications and to
address concerns that digital technologies could hinder access, an
approach Geist cites as "a model for preserving online publications."
Three years ago the government empowered
the Library and Archives Canada (LAC) to sample Web pages as part of an
initiative to preserve notable Canadian Web sites, and new rules were
introduced this year in consideration of online publications and
DRM-encoded books. Under the new regulations, a publisher is defined as
anyone "who makes a publication available in Canada that the person is
authorized to reproduce or over which the person controls the content,"
while online publications are defined as having a unique title, a specific
author or authoring body, a specific date, and being intended for public
consumption.
Many publishers who release their content solely online will be required to
start submitting their works to the LAC, but there is no requirement for
all publishers to submit electronic versions of printed documents, and
Geist argues that such a requirement should be a subject of future
consideration. Another new rule requires publishers to decrypt encrypted
data in a publication and to eliminate or deactivate systems that restrict
or limit access to the publication prior to LAC submission. Additionally,
the LAC must be supplied with a copy of the software and technical
information needed for accessing the publication, as well as any metadata
connected to the online publication.
IEEE 802.11n Working Group Approves Draft 2.0
InfoWorld (01/22/07) Schwartz, Ephraim
IEEE 802.11n draft version 2.0 has been approved by its working group,
satisfying all the contending parties that had delayed the process. It
will now be voted on by the end of March 2007 and most likely modified
into version 3.0, which would then go through a lengthy balloting process
and finally reach publication around October 2008. The new version will "only
require a minor firmware upgrade for complete compatibility" with
pre-802.11n products, according to working group member Bill McFarland.
The biggest change concerned the implementation of the 40 MHz channel,
which has been modified to accommodate older 2.4 GHz band devices. Two 20
MHz bands will be used in the new spec, which will scan its environment for
legacy devices that may not be able to comprehend the wider bandwidth, and
use a single 20 MHz band for them. Although this feature would cause
slower performance, the new spec's Multiple Input, Multiple Output (MIMO)
technology will speed up performance. Another change will let an 802.11n
device check to see if both channels are free before any data is sent, and
a third change will allow devices to send signals over the WLAN to indicate
that 40 MHz mode should not be used. The new spec will allow higher
throughput (approximately 129 Mbps real world) than today's standard and a
range that is 50 percent greater; also, multiple antennas will allow
fragmented signals to be stitched together, eliminating places indoors
where signals are often dropped.
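The coexistence behavior described above boils down to a small decision rule. The sketch below is invented pseudologic, not text from the draft standard: a device bonds two 20 MHz channels into a 40 MHz channel only when no legacy device has been seen, no neighbor has signaled that 40 MHz mode should be avoided, and both channels are clear.

    # Invented illustration of draft-2.0 channel-width fallback logic
    def choose_channel_width(legacy_device_seen, neighbor_forbids_40mhz,
                             both_channels_clear):
        """Return the channel width, in MHz, the device would use."""
        if legacy_device_seen or neighbor_forbids_40mhz or not both_channels_clear:
            return 20  # fall back to a single 20 MHz channel
        return 40      # safe to bond both 20 MHz channels

    print(choose_channel_width(False, False, True))  # 40
    print(choose_channel_width(True, False, True))   # 20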
Custom Processing--Transcript
ACM Queue (01/07) Vol. 4, No. 10, Vizard, Michael
IBM chief scientist Peter Hofstee discusses system on a chip (SOC) and
related design issues and the potential impact of the Cell processor on
the SOC marketplace. "I think in many ways the system on a chip era has
already arrived and Cell I think is an example of that and maybe one of
the more clear examples," says Hofstee. He explains that Cell's ability
to capture a number of diverse market segments fuels the move toward
integration and SOC, and notes that we are midway through the tipping
point. "It ... will drive us towards specialization of cores, where
different cores will be optimized for different functions, and in the case
of Cell, a core that is focused on the control function and then another
one that is focused on the compute function," projects Hofstee. He
describes Cell as a microprocessor equipped with an integrated memory
controller, lots of bandwidth, an on-chip coherence fabric, and an IO
controller. Hofstee foresees Cell-based SOC designs eventually becoming
the most prevalent way for people to construct next-generation dedicated
computers, at the very least, if the software standardization challenge is
met.