E-Voting Measures Sought to Avoid Disputes
San Jose Mercury News (CA) (11/29/06) Davies, Frank
California Senator Diane Feinstein, who will take over the Rules and
Administration Committee that oversees how federal elections are run, has
made it clear that she will scrutinize the e-voting process. "It's
imperative that Congress does everything it can to help ensure that votes
cast are recorded accurately," Feinstein said. "Serious questions have
arisen about the accuracy and reliability of new electronic voting
machines." Even before the previous election, in which Sarasota Country,
Fla., confirmed the concerns many had about e-voting, she had been planning
legislation, similar to one that failed to pass in the House by two votes,
mandating a paper trail for all electronic voting systems in the country.
Electionline.org's Doug Chapin said, "At first I thought there were lots
of fender-benders on Election Day but no major pile-ups. But Sarasota is a
pile-up." State officials and voting machine manufacturers are being
pointed at to do a better job of testing and auditing equipment before
elections. Republicans are pushing for legislation ensuring voter ID and
fraud prevention, and Feinstein herself wants to outlaw state election
officials from taking part in a federal candidate's campaign committee.
Feinstein worries that continued problems, in a district that has greater
national ramifications than Sarasota County, or worse, in a presidential
election, will lead to a harmful loss of confidence in the nation's ability
to conduct elections. Stanford University computer science professor David
Dill said lost votes complaints following the 2004 elections were not
adequately investigated. He says, "The complaints need to be investigated
urgently, or machine problems will lead to more disputed elections in the
future." For information about ACM's e-voting activities, visit
http://www.acm.org/usacm
The Problem With Programming
Technology Review (11/28/06) Pontin, Jason
Bjarne Stroustrup, who invented C++, explains in this interview that he
still stands by the language he built, and thinks that most programming
being done now is below par. While there is quality software out there,
like Google, he says, "looking at the 'average' piece of code can make me
cry. The structure is appalling, and the programmers clearly didn't think
deeply about correctness, algorithms, data structures, or maintainability."
Rather than being sure of a system's quality and why it works so well,
Stroustrup says programmers are "in a constant state of grasping at straws
to get our work done. The snag is we often do not know how we did it: a
system just 'sort of evolved' into something minimally acceptable." In
order to remedy this situation, he thinks that education must be improved,
using "more-appropriate design methods, and design for flexibility and for
the long haul." However, this fix is difficult to achieve because computer
users do not want to be inconvenienced by abrupt changes; only a gradual,
wide-ranging effort toward change will be effective. "Software developers
have neutralized the astounding performance of modern computer hardware by
adding layer upon layer of over-elaborate [software] abstractions," says
Stroustrup, whose solution is that more experts should be trained to use
C++, which has fallen out of the mainstream, rather than simply "dumb[ing]
down" programming languages. He says the generality built into C++ was the
result of his "view that to do higher-level stuff, to build complete
applications, you first needed to buy, build, or borrow libraries providing
appropriate abstractions." Stroustrup believes that the large amount of
criticism that has been aimed at C++ is a testament to how useful it really
is.
Design Automation Conference Announces Executive
Committee
Business Wire (11/27/06)
Steven P. Levitan, the former chair of the ACM Special Interest Group on
Design Automation (SIGDA), will serve as the general chair of the executive
committee for the 44th Design Automation Conference (DAC). Dr. Levitan is
the John A. Jurenko Professor of Computer Engineering in the Department of
Electrical and Computer Engineering at the University of Pittsburgh, and he
also has a joint appointment with the Department of Computer Science. He
will guide the committee of volunteers from the electronics and electronic
design automation (EDA) industry in planning and overseeing management of
DAC's operations, including technical programs, exhibitions, new projects,
and publicity. Levitan, an expert in design, modeling, simulation, and
verification of mixed technology micro-systems, has been involved in DAC
executive committees since 1998. The 44th DAC is scheduled for June 4-8,
2007, at the San Diego Convention Center in San Diego, Calif. Carnegie
Mellon University's Diana Marculescu will serve as the ACM/SIGDA
representative on the executive committee. Other members of the executive
committee include Limor Fix of Intel Research, Leon Stok of IBM, Sachin
Sapatnekar of the University of Minnesota, Yervant Zorian of Virage Logic,
Ellen M. Sentovich of Cadence Berkeley Labs, Narendra Shenoy of Synopsys,
Andrew B. Kahng of the University of California at San Diego, Kaushik Roy
of Purdue University, Nanette V. Collins of Nanette V. Collins Marketing
and PR, Georges Gielen of Katholieke University in Belgium, and Yusuke
Matsunaga of Kyushu University in Japan. "With the combined energy,
expertise and experience of this remarkable group of volunteers driving it,
we are looking forward to a very strong conference in San Diego next June,"
says Levitan. For more information on DAC 2007 visit
http://www.dac.com/44th/index.html
Vote Disparity Still a Mystery in Fla. Election for
Congress
Washington Post (11/29/06) P. A3; Whoriskey, Peter
Florida's 13th Congressional District is still trying to get to the bottom
of why there were no votes cast for Congress by 18,000 Sarasota County
residents who voted for candidates in other races. Some claim that the
touch-screen voting system had a glitch that dropped votes, others that a
confusing ballot caused voters to overlook the race, and finally that
voters simply decided not to vote in this particular race, a possibility
that has received little support. "Our analysis of the results show that
something went very wrong," says Kendall Coffey, attorney for challenger
Christine Jennings, who has been declared the loser of the race pending
further investigation. Coffey dismissed a mock election that showed no
signs of machine malfunction because clerical workers, not ordinary
voters, used the machines to cast votes. While 2.5 percent of voters in
other Florida counties did not cast a vote in every race, a phenomenon
known as "undervoting," 15 percent undervoted in Sarasota County.
Two different election experts who had their own troubles with the voting
machines support the theory that the machines are to blame, and the
Sarasota Herald-Tribune has documented more than 100 reported problems
with the machines. The confusing-ballot theory is supported by MIT's Ted
Selker, director of the Caltech/MIT Voting Technology Project, whose own
tests show that as many as 60 percent of voters can miss races displayed
the way the race in question was; Coffey counters that so high-profile a
race is very unlikely to have been simply forgotten or overlooked by so
many voters.
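The reported figures can be cross-checked with simple arithmetic, assuming, as the article implies, that the 18,000 undervotes represent 15 percent of the ballots cast:

```python
# Rough cross-check of the undervote figures reported for Sarasota County.
# If 18,000 undervotes were 15 percent of ballots, ~120,000 ballots were cast.
undervotes = 18_000
observed_rate = 0.15   # Sarasota County's undervote rate
typical_rate = 0.025   # rate reported for other Florida counties

ballots_cast = undervotes / observed_rate
expected_undervotes = ballots_cast * typical_rate
excess = undervotes - expected_undervotes

print(f"ballots cast:        {ballots_cast:,.0f}")
print(f"expected undervotes: {expected_undervotes:,.0f}")
print(f"excess undervotes:   {excess:,.0f}")
```

By this estimate, roughly 15,000 more voters skipped the race than the statewide pattern would predict.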
Declining Comp. Sci Enrollment Levels Off
Yale Daily News (11/29/06) Balakrishna, Kanya
The number of computer science majors is no longer on the decline at Yale
University, says Computer Science Department Chairman Avi Silberschatz, and
Stan Eisenstat, director of Undergraduate Studies, believes the number will
rise next year, which would mark the first increase in five years.
Reflecting a nationwide trend, Yale has seen its number of computer
science majors fall from 71.5 students in the 2001-02 academic year to 24.5
last year. There has not been a comparable decline in enrollment in Yale's
M.S. and Ph.D. programs. Nationally, the number of computer science
majors in 2005 was half the total from 2000, and the Computing Research
Association says the number of students pursuing a computer science degree
dropped 70 percent between 2000 and 2005, according to data compiled by
the Higher Education Research Institute at the University of California,
Los Angeles. Meanwhile, other Ivy League schools have also reported large
swings in the number of computer science majors; at Harvard University,
computer science majors fell from 174 in 2001 to 67 in 2005, while at
Princeton University, students majoring in computer science fell to 14 last
year, down from 36 in 2000-2001. Although Eisenstat believes the apparent
rebound of dot-coms may have something to do with the leveling off of
computer science enrollment, computer science major Nick Piepmeier believes
Web 2.0 is too much of a niche market to be such an influence. "I feel
like in the aftermath of the bust people are finally realizing that it's
still really easy to get jobs in the computer industry, and that there's
still money to be made there," says Piepmeier, who serves on the
departmental Student Advisory Committee. He also believes the number of
computer science majors will pick up in the next few years.
Engineers Seek to Equip Operating Room of the
Future
JHU Gazette (11/27/06) Vol. 36, No. 12, Sneiderman, Phil
The operating room of the future could be filled with robots, visual
displays, and digital workstations, according to engineers and computer
scientists at the National Science Foundation Engineering Research Center
for Computer-Integrated Surgical Systems and Technology. Researchers at
the center on the campus of Johns Hopkins University believe their work has
the potential to improve the safety of surgery and allow surgeons to
attempt operations that otherwise would be unlikely to be pursued.
The robotic systems are meant to provide assistance to surgeons, and not
replace them, center director and computer science professor Russell H.
Taylor cautions. For example, a team of researchers has designed a
snakelike robot to allow surgeons to be more precise in making incisions
and tying sutures when operating in the throat area. Another team is
developing a steady-hand system that is designed to offset uncontrolled
hand movements with cooperative manipulation techniques, which should
enable surgeons to have greater success in microsurgery. Such robotic
assistants, visual displays guiding surgeons through procedures, and
digital workstations offering instant access to medical information would
all be connected to computers, which could serve as a black box for the
operating room, and provide clues into why certain techniques are
successful. The researchers believe the technology will one day appear in
the operating room, but say several more years of testing and further
development are needed.
Backseat Virtual Reality Entertains Passengers
New Scientist (11/24/06) Simonite, Tom
Researchers at the Interactive Institute in Stockholm, Sweden, are testing
an in-car gaming system that allows passengers to play an interactive game
based on the buildings, forests, and rivers passed along a route while
driving. The Backseat Playground uses such landmarks encountered during a
trip to build a story, complete with in-game characters and events. The
game matches sights to events in an adventure that might involve a murder
mystery or a werewolf thriller, and makes use of a GPS receiver to provide
geographical data, a handheld computer for player interaction as the story
builds, and headphones for players to listen to phone calls and
walkie-talkie messages from in-game characters. A laptop in the trunk,
which correctly positions the car in the virtual world, connects the GPS
receiver, handheld computer, and headphones. "We are trying to suggest
spaces and places and events and have the user fill in the gaps to build a
narrative," explains John Bichard, who developed the interactive game with
colleagues Liselott Brunnberg and Oskar Juhlin. The computer scientists
are considering integrating voice recognition into the game, which would
allow players to talk directly to the characters. Rob Aspin, with the
Centre for Virtual Environments at Britain's University of Salford, is
intrigued by the way in which content is delivered for the game. "It can
create a high sense of presence and interaction while hiding most of the
technology from the user," says Aspin.
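The landmark-triggering mechanism described above can be sketched as follows; the landmark coordinates, trigger radius, and story events are invented for illustration, and this is not the Backseat Playground's actual code:

```python
import math

# Sketch of a location-triggered game loop: fire a story event when the
# GPS feed places the car near a known landmark. All landmarks and
# events below are made up for illustration.

EARTH_RADIUS_M = 6_371_000

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in meters."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

LANDMARKS = [
    # (name, latitude, longitude, story event to trigger)
    ("old church", 59.3326, 18.0649, "phone call from the detective"),
    ("riverbank", 59.3400, 18.0500, "walkie-talkie: a splash was heard"),
]

def events_near(lat, lon, radius_m=200):
    """Return story events for landmarks within radius_m of the car."""
    return [event for name, llat, llon, event in LANDMARKS
            if distance_m(lat, lon, llat, llon) <= radius_m]

# As each GPS fix arrives, poll for newly triggered events:
print(events_near(59.3327, 18.0650))
```

A real system would also track which events have already fired and sequence them into a coherent story, but the distance test above is the core trigger.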
Computers at MSU Take the Lead in High-Speed Studies of
Evolution
Lansing State Journal (MI) (11/26/06) Miller, Matthew
The Digital Evolution Lab at Michigan State University is home to
computers that simulate the evolution of billions of organisms at rates
that would be impossible to observe in the natural world, shedding a great
deal of light not only on evolution but on computer science as well. "If
you think of natural organisms, it takes months to years for a generation
to go by for a sophisticated organism," says lab director Charles Ofria.
"With these digital organisms, we can have a generation go by every
second." Computer scientists have begun to use such observations of
evolution at work to create strange and exciting innovations. The digital
organisms that "live" on the computer's circuitry are simple programmed to
self-replicate, but each time they do so, there is the possibility of a
mutation occurring. Ofria and the California Institute of Technology's
Chris Adami created a program called Avida, in which they can create
habitats where the organisms must struggle to survive, simulating natural
selection; when the organisms are able to adapt, they are rewarded with
extra computer processing time that allows them to reproduce faster. After
many generations, the organisms carry stronger genetic codes, a process of
finding solutions that is of great benefit to computer scientists: "In a
sense, they're teaching us a shorter way of writing good code," says Ofria.
The organisms self-replicate, randomly mutate, and thus adapt "in such a
way it would be extremely difficult for a human programmer to condense
these millions of problems into a single, relatively seamless solution,"
says MSU evolutionary biologist Richard Lenski.
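The reward scheme Ofria describes, in which better-adapted organisms earn more processing time and therefore more offspring, can be illustrated with a toy evolutionary loop; this is a sketch of the general idea, not Avida's implementation, and the bit-counting "task" is invented:

```python
import random

random.seed(42)

# Toy illustration of digital evolution: organisms replicate with
# occasional copy errors, and better-adapted organisms are rewarded with
# more offspring (standing in for extra processing time).

GENOME_LEN = 32
MUTATION_RATE = 0.02  # per-bit chance of a copy error

def fitness(genome):
    """Stand-in for tasks solved: the number of 1 bits (+1 so all can reproduce)."""
    return sum(genome) + 1

def replicate(genome):
    """Imperfect copy: each bit may flip with probability MUTATION_RATE."""
    return [b ^ 1 if random.random() < MUTATION_RATE else b for b in genome]

population = [[0] * GENOME_LEN for _ in range(100)]
for generation in range(200):
    weights = [fitness(g) for g in population]
    # Fitness-proportional selection: extra fitness buys extra offspring.
    parents = random.choices(population, weights=weights, k=len(population))
    population = [replicate(g) for g in parents]

best = max(sum(g) for g in population)
print(f"best task score after 200 generations: {best} of {GENOME_LEN}")
```

Even starting from all-zero genomes, random mutation plus selection quickly discovers high-scoring genomes, which is the "finding solutions" behavior the researchers exploit.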
DOD Report to Detail Dangers of Foreign Software
Computerworld (11/27/06) Anthes, Gary
The Defense Science Board (DSB), a military/civilian think tank within the
Defense Department, has conducted a study into the security of software
developed overseas, and will make recommendations to the DoD based on its
findings, but will not advise that all military software be created within
the United States. Task force chairman Robert Lucky explains, "The
problem is we have a strategy now for net-centric warfare--everything
is connected. And if the adversary is inside your network, you are totally
vulnerable." The private sector has already experienced changes based on
the task forces findings, although many see this attitude as simply
xenophobia, stating that all software should be scrutinized equally. Lucky
says that users should aim to make trade-offs between the amount of risk
and the economics of creating a given piece of software. Protective steps
cited by the DSB are: Peer reviews where several programmers review and
test code; utilizing scan tools to search for hidden malware; and enforcing
industry quality standards; and while each of these remedies is not a
perfect fix or prevention, the combination will effectively "raise the
bar," as Lucky says, and "eliminate a certain percentage of problems."
However, some, such as Ira Winkler, author of "Spies Among Us," feel that
a single line of foreign-written code in U.S. military software is too
great a security risk. While such a policy would ideally ensure
against foreign malware, there are few, if any, U.S. software companies
whose products do not contain any code written overseas, and according to
Lucky, "we're talking about complexity that boggles the mind. It's so
enormous that no one can truly understand a program with millions of lines of
source code."
Software Patent Conference Outlines Problems, Possible
Solutions
NewsForge (11/27/06) Bisbee-von Kaufmann, Samuel Kotel
Problems with software patents and possible solutions were the focus of
the Nov. 17 "Software Patents: A Time for Change?" conference hosted by MIT
and Boston University Law School. A temperature reading of the current
software patent situation was taken during the first panel discussion,
which mentioned companies' acquisition of software patents to defend
against litigation from rivals seeking to generate profits from their own
portfolios, and the opaque definition of patentable software by the
European Patent Office, among other things. Bronwyn Hall of the University
of California Berkeley Graduate School and the University of Maastricht
observed in the second panel discussion that the growth of software patents
does not reflect their value, which is for the most part nonexistent.
Participants in the third panel pointed to the lack of consideration the
World Wide Web Consortium had for patents initially because of concerns
about technology; the creation of monopolies and the impedance of
innovation by patent portfolios; and various reasons for the lack of
emphasis on patents by entrepreneurs and startup firms. Legal
ramifications were covered in the fourth panel, with panelists noting that
a thing's patentability is a matter of perspective. For example, it was
University of Akron School of Law professor Jay Dratler's opinion that a
thing can only be designated an invention if it requires technical risk and
thus the risk of failure. The final panel discussed possible reform
strategies; suggestions ranged from greater disclosure in patent
applications, via the required deposit of source code, to educating the
U.S. Patent Office about these problems through one-page explanatory
documents, to the creation of business incentives.
Bye Swarmbots, Hello Swarmanoid
Wired News (11/28/06) Cole, Emmett
Free University of Brussels in Belgium researchers are developing a
"swarmanoid," a swarm of 60 small, autonomous robots whose members are
specialized for different roles. Marco
Dorigo, project leader and research director at the university's IRIDIA
lab, says the swarm consists of "footbots" that are based on earlier,
uniform swarm bots and move objects along the ground, "handbots" that climb
walls, and "eyebots" that can attach to the ceiling to use their visual
sensors; there are even swarm bots that will fly. By using specialized
swarm bots, with a division of labor like that seen in ant
colonies, the swarmanoid will be able to complete more customized tasks,
such as household chores or retrieving an object for a humanoid. Georgia
Institute of Technology associate professor in interactive and intelligent
computing Tucker Balch explains, "For robots to really make an impact
on the world, we have to get lots of robots into people's hands. The two
barriers are cost and utility, but it becomes feasible with the swarm idea,
which would allow households to buy several inexpensive robots that could
work together. The view of swarms consisting of all the same robots just
isn't going to take off." Swarms are also being considered for use at the
micro or nano level for procedures inside the human body. Dorigo says he
hopes to publish his work towards the end of 2007, with experimental
results ready in about two years.
Harnessing Grid Computing to Save Women's Lives
IST Results (11/29/06)
The accuracy of breast cancer diagnoses is receiving a valuable boost from
grid computing. The rate of misdiagnosis of breast cancer can be as high
as 30 percent due to differences in individuals, equipment, procedures,
and problems using the computers that detect changes in breast tissue.
MammoGrid, an IST project that ended in August 2005, produced software that
provides medical professionals access to digital mammograms stored across
Europe. A geographically distributed, grid-based database, consisting of
30,000 standardized images and corresponding data from some 3,000
patients, allows mammograms of current patients to be compared with others
and subjected to
detection algorithms to identify possible concerns. "The system in its
current version allows a user to securely share both resources and patient
data which has been treated to ensure anonymity," says Maat Gknowledge's
David Manset, who served as leader of the project. Such an innovation
brings about a new level of statistical analysis for breast cancer in its
many forms, which will hopefully save many lives. The technology is
currently being expanded to new hospitals and tested for its ability to
meet market demands, before hopefully being expanded across all of
Europe.
Canada Experts Find Path Round Internet Firewalls
Reuters (11/28/06) Dabrowski, Wojtek
People living in countries that overly restrict Internet access and block
Web sites will be able to circumvent the firewalls of their government
using new software developed by computer researchers at the University of
Toronto. The program, Psiphon, is designed to turn an Internet user's
computer essentially into a server that someone in another country can use
to browse the Internet away from the watchful eyes of their government.
Psiphon allows anyone living in a country with unfettered access to the
Internet to set up an account, and then enable someone in a more
restrictive country to log on through that computer. The free download, which
will be available starting Friday, offers encrypted and secure Internet
surfing for users, which will prevent their government from tracing their
Web surfing patterns. "The communities that we're helping to connect to
each other have a legitimate right to exercise their human rights within
this government regime," says Ron Deibert, director of the university's
Citizen Lab, who also acknowledges that Psiphon might be unlawful in those
countries. "It does conflict with some sovereign states' values, but there
are competing legal norms at work."
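The basic idea, one machine fetching pages on behalf of a remote user, can be sketched as a minimal forwarding handler. This is an illustration of the concept only: the /fetch?url= interface is invented, and the real Psiphon adds per-user accounts and serves everything over an encrypted channel:

```python
# Conceptual sketch of a web forwarder: a machine in an uncensored
# country retrieves pages on behalf of a remote user and relays them back.

from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse
from urllib.request import urlopen

def parse_target(path):
    """Extract the requested target URL from a /fetch?url=<target> path."""
    query = parse_qs(urlparse(path).query)
    return query.get("url", [None])[0]

class ForwardingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = parse_target(self.path)
        if target is None:
            self.send_error(400, "expected /fetch?url=<target>")
            return
        try:
            with urlopen(target) as upstream:  # fetch on the user's behalf
                body = upstream.read()
        except OSError as exc:
            self.send_error(502, f"fetch failed: {exc}")
            return
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)  # relay the bytes back to the remote user

# To run the forwarder (blocking call):
#     HTTPServer(("127.0.0.1", 8080), ForwardingHandler).serve_forever()
```

From the censoring network's perspective, the remote user's traffic all flows to one trusted host, which is why the channel must be encrypted for the scheme to hide browsing patterns.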
Super-computer Boss Has a To-Do List for a Better Future;
Better Medical Care, Disaster Preparation Are Goals
Triangle Business Journal (11/27/06) Horlbeck, Fred
Dan Reed, the new director of the Renaissance Computing Institute in
Chapel Hill (RENCI), has big plans to use supercomputing to bring about
weather-response, medical, and economic change in North Carolina. RENCI, a
joint venture among UNC-Chapel Hill, Duke, North Carolina State, and
the state of North Carolina, utilizes a state-of-the-art computer network,
including the IBM Blue Gene/L supercomputer, the second fastest in North
Carolina, and data and information technology that allows high levels of
collaboration. Reed sees great potential in the "intersection of
disciplines," where people and organizations provide knowledge, abilities,
and ideas to solve problems more efficiently. RENCI's power, with help
from weather-related organizations, will bring about "high-resolution"
weather forecasting, according to Reed, that will be able to track a
storm's path as well as predict the exact location of flooding. Where
medical advancements are concerned, RENCI will alert doctors immediately as
to anomalies occurring in patients that have been given a special device to
wear, hopefully allowing a crisis to be avoided. Reed predicts that in 10
years, an individual's genome sequence could be profiled, providing doctors
with specific vulnerabilities to disease. To aid economic advances, RENCI
will be able to take in corporate data and locate areas of growth in order
to devise ways to create further growth. Reed says, "This is one of the
first attempts to do this in the U.S. We're trying to bring people
together from across the state. We're trying to be a catalyst for
innovation."
Smart Spaces: If These Walls Could Talk
Computerworld (11/27/06) Anthes, Gary
The concept of "smart spaces" has been around for quite some time, and
while the technology required for the individual components exists today,
interoperability, accuracy, and reliability prove to be stumbling blocks.
Different types of sensors, large touch-screen displays, cameras,
microphones, and other devices were incorporated into a prototypical
"interactive room," or iRoom, by Stanford researchers. The room runs the
Interactive Room Operating System (IROS), a metaoperating system that
Stanford computer science professor Terry Winograd describes as having
"taken the operating system idea to the space level, so people can
coordinate their work in an environment with multiple devices." The goal in such
a project, as Winograd explains, is to maximize seamlessness and
transparency, because, "Whenever you have to stop focusing on what you care
about to focus on how the machine is doing, you lose fluency." IBM
Research senior manager for responsive enterprise solutions Stefan Hild,
who worked on an IBM prototype interactive office, explains, "The
investment of taking an office building and enabling it that way is fairly
high. But you can get 80 percent benefit with 20 percent of the cost."
While such an investment could pay off, technology needs to make some
progress first. Hild recognizes that turning an office building into a
completely interoperable and interactive, real-time environment would
require drastically scaling up networks and processors.
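One common way to coordinate many devices in a shared space, sketched here purely for illustration (IROS's actual mechanism may differ), is a shared event broker that devices publish to and subscribe from, so no device needs to know about the others directly:

```python
from collections import defaultdict

# Illustrative sketch of room-level device coordination: devices publish
# events to a shared broker and subscribe to the event types they can act
# on. The event names and devices below are invented.

class EventBroker:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        """Register a handler to be called for every event of this type."""
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, **payload):
        """Deliver the event payload to every subscriber of its type."""
        for handler in self._subscribers[event_type]:
            handler(payload)

room = EventBroker()
log = []

# A wall display reacts when any device asks to show a document.
room.subscribe("show-document", lambda e: log.append(f"display: opening {e['name']}"))
# A lighting controller reacts to dimming requests.
room.subscribe("lights", lambda e: log.append(f"lights: {e['level']}%"))

# A laptop and a touch panel publish without knowing who is listening.
room.publish("show-document", name="q3-report.pdf")
room.publish("lights", level=40)
print(log)
```

Decoupling publishers from subscribers is what lets heterogeneous devices enter and leave the space without reconfiguring every other device, which is the fluency Winograd emphasizes.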
It's a Woman's World Wide Web
New Scientist (11/25/06) Vol. 192, No. 2579, P. 58; Biever, Celeste
Wendy Hall claims, "There is nothing traditional or geeky about me," and
this is only the beginning of the way in which she shatters stereotypes in
an IT world dominated by men: She is currently the head of the University
of Southampton's world-class electronics and computer science department, a
senior VP of the Royal Academy of Engineering, VP of the ACM, and sits on
the Council for Science and Technology, which advises the prime minister;
all this despite being told she didn't get her first job because of her
sex, and having many of her later ideas ignored by male counterparts.
After teaching herself to program in the 1980s, Hall was attracted to the
ability that computers had to improve people's lives: "I could see what
could be possible once the technology developed." In 1989 she launched a
program called Microcosm: a database of electronic photos, documents, and
recordings that could be linked to each other in different ways, depending
on the user. Links were created in real time as a document was read, by
comparing its contents with related content on the hard drive, so the
links could be shown dynamically based on the user's browsing habits.
However, Tim Berners-Lee's invention of the
World Wide Web took off instead of Microcosm, because its links were
embedded and it worked on a global network that could be accessed by anyone
with an Internet connection, while Microcosm ran only on
standalone machines. Recently, Hall has been involved with the creation
of the Web science research initiative, which will focus on the
relationship between computer science and social science. She says merging
the two disciplines could attract more women to computing, a cause she
champions because computer science is a field she loves. Hall says, "All
the wonderful things I am doing are because I am a computer scientist. IT
and computing are the basis of everything."
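The dynamic-linking idea attributed to Microcosm, computing links at read time by matching a document's contents against an index rather than embedding anchors, can be sketched as follows; the index entries and documents are invented for illustration:

```python
import re

# Sketch of dynamic linking: instead of anchors embedded in the document,
# links are computed as the document is read, by matching its words
# against an index of related material on the user's machine.

INDEX = {
    # term -> related resource (all invented examples)
    "butterfly": "photos/butterfly.jpg",
    "migration": "notes/migration-recording.wav",
}

def dynamic_links(text):
    """Return (term, target) links for every indexed term found in the text."""
    words = {w.lower() for w in re.findall(r"[a-z]+", text, re.IGNORECASE)}
    return sorted((term, target) for term, target in INDEX.items()
                  if term in words)

print(dynamic_links("The butterfly migration begins in autumn."))
```

Because the links live in the index rather than the document, the same text can link differently for different users, which is the flexibility the article contrasts with the Web's embedded links.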
Learning Through Multimedia: Automatic Speech Recognition
Enhancing Accessibility and Interaction
University of Southampton (ECS) (11/26/06) Wald, Mike
Researcher Mike Wald demonstrates the enhancement of learning and teaching
quality via automatic speech recognition (ASR) to access, manage, and
leverage online multimedia content. His presentation shows that ASR
technology can help guarantee that both in-person learning and online
learning are universally accessible via the cost-effective generation of
synchronized and captioned multimedia. According to Wald, this strategy
accommodates preferred learning/teaching approaches, and can help those who
have problems taking notes because of cognitive, sensory, or physical
difficulties. In addition, the approach can aid learners with the
management and mining of online digital multimedia resources, as well as
offer automatic speech captioning to hearing-impaired learners or any
others to whom speech is unavailable, unsuitable, or inaudible. Users with
blindness or other visual impairments can also benefit from the method,
which helps them read and search learning material through the enhancement
of synthetic speech with natural recorded real speech. Furthermore,
teachers as well as learners can improve their spoken communication skills
through reflection afforded by ASR. "Although it can be expected that
developments in ASR will continue to improve accuracy rates, the use of a
human intermediary to improve accuracy through correcting mistakes in real
time as they are made by the ASR software could, where necessary, help
compensate for some of ASR's current limitations," Wald writes. The
projection of text onto a large screen has had some success in classroom
situations, but many circumstances call for the provision of an individual
personalized and customizable display. Wald concludes that the ideal
system for digitally recording and replaying multimedia content would
automatically produce a mistake-proof transcript of spoken language
synchronized with the audio, video, and any graphical elements; display
it in the most suitable manner on diverse devices, with adjustable replay
speed; and accept annotation via pen or keyboard and mouse, likewise
synchronized with the multimedia content.
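The synchronized-captioning step Wald describes can be sketched with a small function that groups per-word ASR timings into caption blocks; the (word, start_sec, end_sec) input format is an assumption about what the ASR engine provides:

```python
# Sketch of caption generation from ASR word timings: group words into
# short blocks synchronized with the audio, in SubRip (SRT) style.

def to_captions(words, max_words=5):
    """Group (word, start, end) tuples into numbered SRT-style caption blocks."""
    def fmt(t):
        # Seconds -> "HH:MM:SS,mmm"
        h, rem = divmod(int(t * 1000), 3_600_000)
        m, rem = divmod(rem, 60_000)
        s, ms = divmod(rem, 1000)
        return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

    blocks = []
    for i in range(0, len(words), max_words):
        chunk = words[i:i + max_words]
        start, end = chunk[0][1], chunk[-1][2]
        text = " ".join(w for w, _, _ in chunk)
        blocks.append(f"{len(blocks) + 1}\n{fmt(start)} --> {fmt(end)}\n{text}")
    return "\n\n".join(blocks)

words = [("welcome", 0.0, 0.4), ("to", 0.4, 0.5), ("the", 0.5, 0.6),
         ("lecture", 0.6, 1.1), ("on", 1.1, 1.2), ("accessibility", 1.2, 2.0)]
print(to_captions(words))
```

A human corrector, as Wald suggests, would edit the word tuples before this step; the timing data is preserved, so corrected captions stay synchronized.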
The Ultimate White Light
Scientific American (12/06) Vol. 295, No. 6, P. 86; Alfano, Robert R.
Optical data transmission could achieve unprecedented speeds with the
advent of "supercontinuum" (SC) laser light, which melds useful properties
of laser light with the broad bandwidth spectrum of white light, writes
City College of the City University of New York professor Robert Alfano.
Alfano pioneered SC light with Stanley Shapiro at General Telephone and
Electronics Laboratories (since renamed Verizon) in 1969. SC light is
primarily generated today by transmitting high-intensity pulses of laser
light through specially designed microstructure fibers. The light and the
fiber material interact through a series of nonlinear optical processes
that extend the light's bandwidth. One such process is self-phase
modulation. SC light can be applied to provide extremely accurate
frequency measurements and clocks, detection of airborne chemicals such as
pollutants and aerosols, and high-resolution medical imaging via optical
coherence tomography. High-throughput telecommunications with data
transmission rates that beat current systems by a factor of 1,000 is
another application of SC light, one with more immediate commercial
ramifications.
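The self-phase modulation the article mentions is conventionally summarized by the textbook relations below, where n2 is the nonlinear refractive index, I(t) the pulse intensity, L the fiber length, and lambda the wavelength; because the phase varies with the pulse's own intensity, the instantaneous frequency shifts in time, broadening the spectrum:

```latex
\phi_{\mathrm{NL}}(t) = \frac{2\pi}{\lambda}\, n_2\, I(t)\, L,
\qquad
\delta\omega(t) = -\frac{\partial \phi_{\mathrm{NL}}}{\partial t}
               = -\frac{2\pi\, n_2\, L}{\lambda}\, \frac{\partial I(t)}{\partial t}
```

The steeper the pulse's rising and falling edges, the larger the frequency excursion, which is why short, intense pulses in small-core microstructure fiber broaden so dramatically.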
The Future of Simulation: A Field of Dreams?
Computer (11/06) Vol. 39, No. 11, P. 22; Yi, Joshua J.; Eeckhout, Lieven;
Lilja, David J.
There is a growing reliance on simulators among computer architecture
researchers because simulation can strike a balance between cost,
flexibility, and timeliness, write Freescale Semiconductor's Joshua Yi
and his coauthors. But the diversity of benchmarks, methodologies, and
data sets raises numerous questions: whether simulators suitably model
processor or system behavior, what the essential elements of future
simulators are, and how to design benchmark suites that are
representative without excessive redundancy. Addressing these questions was the
impetus behind a panel discussion on simulation infrastructure, benchmarks,
and simulation methodology at the International Symposium on Performance
Analysis of Systems and Software in March 2004. The efficient traversal
and characterization of the design space is problematic, and among the
alternatives the authors recommend are analytical models, statistical
simulation, and specialized trace-driven simulation. Analytical modeling
and statistical simulation are faster but less accurate than
cycle-accurate simulation; the speed advantage is usually the more
critical factor, because relative accuracy is generally enough to track
substantial shifts in processor performance, and deployment time is a
fraction of that of cycle-accurate simulation. A
major benchmarking problem is the lack of certainty in the
representativeness of average benchmark suites, and solving this problem
entails the computer architecture community's introduction of additional
benchmark characterization and classification techniques, specifically
those that offer greater accuracy or efficiency than current approaches.
Benchmark length, and the simulation time it translates into, is another
problem, and the use of sampling-based techniques such as SMARTS and
SimPoint is recommended by the authors as a mitigation strategy. Notable
problems with current simulation methodology include ad hoc simulation,
which can be addressed via comprehensive documentation and justification of
the methodology and the addition of more statistical rigor. Solving
reproducibility and comparability problems stemming from widely variable
simulation workloads involves agreement among the research community on
well-balanced processor and memory hierarchy configurations, common
benchmark subsets, and common data sets.
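The sampling approach behind tools like SMARTS can be illustrated with a toy experiment: simulate in detail only every 100th interval of a workload and estimate whole-program behavior from the samples. The "trace" here is synthetic per-interval CPI values with distinct program phases, not a real workload:

```python
import random

random.seed(1)

# Synthetic program: 100,000 intervals whose CPI drifts between three
# phases, plus measurement noise. Stands in for a detailed simulation trace.
trace = [1.0 + 0.5 * ((i // 10_000) % 3) + random.gauss(0, 0.05)
         for i in range(100_000)]

def true_cpi(trace):
    """CPI from 'simulating' every interval (the slow, exact baseline)."""
    return sum(trace) / len(trace)

def sampled_cpi(trace, period=100):
    """Detailed-simulate every period-th interval; fast-forward the rest."""
    samples = trace[::period]
    return sum(samples) / len(samples)

full, sampled = true_cpi(trace), sampled_cpi(trace)
print(f"full simulation CPI: {full:.3f}")
print(f"sampled estimate:    {sampled:.3f} "
      f"({len(trace[::100])} of {len(trace)} intervals simulated)")
```

Simulating 1 percent of the intervals recovers the program-wide CPI to within a few hundredths here, which is the cost/accuracy trade-off that makes sampling attractive; real tools must also warm up caches and branch predictors before each sample, a complication this sketch omits.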