Data Breach Notification Law Unlikely This Year
IDG News Service (05/05/06) Gross, Grant
While Congress seemed poised to work swiftly to pass a data breach
notification bill after the highly publicized security failures in the
first half of 2005, such legislation now appears unlikely to materialize
before this year's session expires. There have been more than 10 bills
introduced since 2005 addressing when companies are required to notify
their customers in the event of a data breach that could compromise
sensitive personal information and whether consumers should be allowed to
freeze their credit reports in the wake of such a breach. In addition to
the conflicting provisions of the different proposals, five congressional
committees have asserted jurisdiction over the legislation. Two bills have
emerged from committee in the Senate and are awaiting debate on the floor
and two more are pending on the House floor. The Senate and House are
looking to adjourn for the year on Oct. 6 to give legislators a month to
campaign for the November elections, and they will both be out of session
for most of August. The remainder of their time will most likely be
focused on hot-button issues such as immigration and rising gas prices.
The future of the bills remains uncertain despite bipartisan support for
the issue, said James Assey Jr., a Democratic counsel in the Senate
Commerce, Science, and Transportation Committee. "It's unclear what
Congress will do," he said at an ACM conference. "Going into the next
Congress, I feel certain these issues will return."  One of the most
contentious points in the debate is what should trigger a notification
requirement, given that companies will have an obvious incentive to
downplay the severity of a breach to their customers.
As Outsourcing Gathers Steam, Computer Science Interest Wanes
Computerworld (05/05/06) Thibodeau, Patrick
The gathering momentum of outsourcing IT work as a business strategy
correlates to a declining interest in computer science among U.S. students,
according to a Computing Research Association (CRA) study that found a 17
percent drop in the number of bachelor's degrees in computer science
awarded at Ph.D.-granting universities in the 2004-05 academic year
compared with the previous year. The widely reported declines threaten to
choke off the availability of highly trained, entry-level IT workers. The
waning interest can principally be attributed to the rise in offshoring,
the continued perception that IT is a volatile field in the wake of the
dot-com bust, and the generally lackluster IT job growth. In the high-tech
sector, IT employment rose 1 percent from 2004 to 2005, the first increase
since 2001. That increase, which brought the number of high-tech IT jobs
to 5.6 million, was weighted down by the slumping telecommunications
sector, which saw the elimination of 42,000 jobs between 2004 and 2005. In
that same year, software jobs increased by 32,000, and the Bureau of Labor
Statistics is predicting a 48 percent increase in the number of software
engineers by 2014. Computer science enrollments have dropped by half from
the 2000 mark of around 16,000, in what CRA's Jay Vegso describes as a
"delayed reaction to the 2001-2002 slowdowns in the tech sector." Others
argue that the declining enrollment in computer science paints a darker
picture than the reality, as many companies are strategically hiring
candidates from a variety of backgrounds, including liberal arts, and
providing them with IT training on the job.  "The world is asking for a
completely different type of professional," said David Foote of the
consultancy and research firm Foote Partners.
To read "Globalization and Offshoring of Software: A Report of the ACM Job
Migration Task Force," visit
http://www.acm.org/globalizationreport
Chip Power Breakthrough Reported
Wall Street Journal (05/08/06) P. B6; Clark, Don
The tiny Silicon Valley firm Multigig has reported a breakthrough in the
synchronization of the functions of computer chips that could resolve the
pressing issue of power consumption and improve the clock circuitry
currently deployed in numerous kinds of chips. Clocks can consume more
than half of the power of some chips, most of which is wasted by the
one-way flow of energy from electrical pulses. The energy consumption
problem has prompted companies such as Intel to essentially abandon the
practice of increasing clock speeds to improve computing performance.
Multigig founder John Wood developed a technique of sending electrical
signals around square loops, mimicking the function of a conventional clock
while recycling most of the electrical power.  The company reports a 75
percent energy savings over traditional clocking techniques. The multiple
loops help to synchronize the timing pulses, combating the effect known as
skew where electrical pulses arrive at slightly different times and
undermine the clock's precision. Multigig is in talks with chip makers to
license the technology, which could be used for synchronizing the
frequencies of communication chips as well as microprocessors. "This is a
dramatic way of clocking circuits," said Gartner's Steve Ohr, adding that
it could be years before manufacturers incorporate the research into
commercial products. "Intel is not going to redesign the Pentium tomorrow
because of it."
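A back-of-the-envelope calculation, using only the figures quoted above (clock circuitry at roughly half of chip power, a 75 percent clock-energy saving), suggests what such a scheme could mean for total chip power. The sketch below is illustrative arithmetic only, not data from Multigig or the Journal.

```typescript
// Illustrative arithmetic only: estimates chip-level savings from recycling
// clock energy, using the rough figures quoted in the summary above.

function chipPowerSavings(clockShare: number, clockEnergySavings: number): number {
  // Overall reduction = share of power spent on the clock
  //                   * fraction of that clock power recovered.
  return clockShare * clockEnergySavings;
}

// Clock at ~50% of chip power, ~75% of clock energy recycled -> ~37.5% overall.
const savings = chipPowerSavings(0.5, 0.75);
console.log(`Estimated chip-level power reduction: ${(savings * 100).toFixed(1)}%`);
```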
Mapping a Path for the 3D Web
CNet (05/08/06) Terdiman, Daniel
Leaders in the fields of video game design, social networking, geospatial
engineering, software development, and other areas met for the first
Metaverse Roadmap Summit to discuss a future increasingly defined by
immersive, virtual environments such as Google Earth and Myspace.com.
Participants at the event, produced by the Acceleration Studies Foundation
(ASF), held a series of discussions and presentations to formulate a
picture of what the metaverse, or the 3D Web, will look like in 10 years.
Event organizers will cull through transcripts of the sessions to formulate
a written document that reflects the prevailing conclusions of the summit.
Consensus was in short supply, however, as many attendees seemed to take
exception to the notion that a prevailing 3D Web will be in place by
2016.  Much of the discussion centered on augmented reality, in addition
to the research breakthroughs required to facilitate the 3D Web.
Participants were unable to agree on whether a prevailing 3D Web should be
proprietary or open source, though most acknowledged that Microsoft and
Google would be the most likely companies to provide the tools required to
build such an environment. Attendees also agreed that mobile devices will
play a larger role in the development of an immersive 3D Web as they become
capable of performing more of the functions of a desktop. Despite the lack
of agreement on many of the issues under discussion, the general feeling
among participants was positive. "I'm not necessarily a huge believer in
central planning of technological and cultural advances," said Corey
Bridges, co-founder of the Multiverse Network. "But happily, that's not
what we're doing here. We are identifying areas to explore. We're seeing
mountains in the distance and saying, 'There's something there, someone
should go investigate it.'" The ASF plans to continue holding full roadmap
summits every two years.
Welcome to the New Dollhouse
New York Times (05/07/06) P. 2-1; Schiesel, Seth
The massively popular domestic simulation video game The Sims has found
its greatest audience in children, and among girls in particular,
shattering the conventional perception of video games as a male-dominated
activity given to violence and competition. No points are awarded in The
Sims, and players cannot win; they can merely try to live the best life
possible given their circumstances and propel their characters toward
happiness. Since its introduction in 2000, The Sims has sold more than 60
million copies around the world. Psychologists agree that playing with
dolls is an important tool for self-discovery among children and especially
girls, and The Sims represents the migration of that activity from 3D
plastic models such as Barbie to a virtual environment. To build on that
connection, Carnegie Mellon doctoral candidate Caitlin Kelleher helps lead
a workshop that encourages girls to take an interest in computer
programming through the use of interactive storytelling software.
Electronic Arts, which makes The Sims, reports that more than half of its
players are female, a marked departure in an industry where males typically
account for more than three-quarters of the customer base. Many girls
report that their interest in The Sims wanes as they approach their late
teens and the emotions and situations that they simulated in the game,
particularly their relationships with boys, become a more important part of
their real lives. James Gee, a professor of education at the University of
Wisconsin, sees girls' declining interest in video games as paralleling a
decline in their pursuit of computer science as a course of study.
"They give up their interest in video games around the same time they give
up their interest in science and math and that's a real problem because
boys use video games to cultivate an interest in technology, and if girls
give that up we're going to continue to see a real gender imbalance in
these areas."
Hard Questions While Waiting for the HPCS Downselect
HPC Wire (05/05/06) Vol. 15, No. 18
The author asks several questions to be considered while awaiting this
summer's High-Productivity Computing Systems (HPCS) downselect from the
Defense Advanced Research Projects Agency (DARPA). The first query
explores whether shared-memory and distributed-memory system architectures
provide unequal opportunities for deploying dynamic load balancing,
while the second query asks what additional elements must be incorporated
within an already formidable collection of heterogeneous system hardware to
produce a genuine heterogeneous system architecture. The author describes
an optimal heterogeneous system architecture as having high global system
bandwidth to tolerate long-range communication latency and to streamline the
programming load by alleviating the performance nonuniformity of memory
accessing; a wide assortment of combined, compile-time, runtime, and
hardware parallelism mechanisms that would adjust to dynamic variation of
the amount, kind, and granularity of an application's parallelism; a
variety of similar mechanisms that would perform the same function, only
this time focusing on the application's locality; and diverse work queues
of fine- and medium-grained parallel activities that would be dynamically
self-scheduled by various kinds of "virtual processors."  A
cost-effective system architecture must employ all of these mechanisms to
yield high performance without overtaxing global system bandwidth. A
heterogeneous architecture's most important benefit is its ability to build
a hybrid system architecture integrating properties of parallel von Neumann
machines and parallel non-von Neumann machines. The author observes that
designers are receptive to parallelism diversity but resistant to
locality diversity, and notes that articulating the full vision of
heterogeneous processing makes sense from a crusader's perspective.
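The dynamically self-scheduled work queues mentioned above can be pictured with a small sketch. The TypeScript below is an illustrative analogy with invented task sizes, not code from the HPCS program: several "virtual processors" pull the next task from a shared queue whenever they finish one, so uneven work balances itself instead of being statically assigned. (In a single-threaded runtime this demonstrates the scheduling discipline rather than true hardware parallelism.)

```typescript
// Illustrative sketch of dynamic self-scheduling: "virtual processors" repeatedly
// pull the next task from a shared queue instead of receiving a fixed, static share.
// Task counts and costs are made up for demonstration purposes.

type Task = () => Promise<void>;

async function virtualProcessor(id: number, queue: Task[]): Promise<void> {
  // Each worker keeps pulling work until the shared queue is empty.
  for (let task = queue.shift(); task !== undefined; task = queue.shift()) {
    await task();
  }
  console.log(`virtual processor ${id} found no more work`);
}

async function main(): Promise<void> {
  // Fine- and medium-grained tasks with uneven costs.
  const queue: Task[] = Array.from({ length: 20 }, (_, i) => async () => {
    const cost = (i % 5) + 1; // uneven task sizes
    await new Promise((resolve) => setTimeout(resolve, cost * 10));
    console.log(`task ${i} (cost ${cost}) done`);
  });

  // Four "virtual processors" self-schedule against the same queue.
  await Promise.all([0, 1, 2, 3].map((id) => virtualProcessor(id, queue)));
}

main();
```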
Lab Aims to Make Items Disabled-Accessible
Associated Press (05/06/06) Bluestein, Greg
Companies are turning to Georgia Tech Research Institute's accessibility
division for testing to ensure that their electronic devices meet the
federal guidelines for making products accessible for people with
disabilities. The federal statute, known as Section 508, requires
companies to upgrade their electronic devices for the disabled in order to
sell them to the federal government. Copy machine companies have responded
to the federal guidelines by incorporating tactile displays and voice
controls into their products. Georgia Tech researchers have worked with
Ricoh to improve accessibility, and the office equipment manufacturer has
settled on a final model for its copiers that includes tilted screens for
wheelchair users and other improvements. The institute offers more
comprehensive testing than the procedures carried out in-house at
companies, and as an independent facility it is more concerned with the
accuracy and integrity of the examination than whether a product passes or
fails. The lab was launched during the Cold War to test the usability of
military systems and other items, and half of its work remains focused in
this area. "It's hard to get more real than military testing," says senior
research scientist Brad Fain. "When your life is on the line, every move
counts."
Five Technologies You Need to Know About
TechWeb (05/03/06) Jones, George
A quintet of emerging technologies have the potential to dramatically
improve computing in terms of efficiency, performance, and functionality
very soon.  Ajax (Asynchronous JavaScript and XML) is a technique, or
rather a frame of mind, whose integration of Web development technologies
results in highly practical and responsive Web sites and services; Ajax
harnesses JavaScript's client-based functionality and XML's efficient and
direct delivery of specific data to enable developers to construct Web
pages that are as responsive as desktop applications (a minimal sketch
appears at the end of this item).  Intel's upcoming Core
microprocessors, which are slated for release in the second half of this
year, promise to boost chip performance and speed while lowering operating
temperatures, and can cooperate in dual-core and multi-core assemblies.
NAND flash memory is ideal for use in solid-state hard drives because it
can read large files and quickly erase and write data, supporting lower
power consumption, faster read/write times, and better reliability. This
upholds the expectation that affordable storage drives with no moving parts
will be realized as flash memory becomes less expensive and more pervasive.
Holographic storage technology can boost disk storage capacity almost
10-fold through the use of 3D imaging, and support substantially higher
data transfer rates. InPhase Technologies has promised to roll out the
first commercial holographic storage product later in 2006, and subsequent
products are expected to boast even more capacity. AMD-Virtualization
(AMD-V) from AMD and Virtualization Technology (VT) from Intel may
accelerate and simplify software-based virtualization by encoding
virtualization capability within hardware, allowing desktops to run
multiple tasks without impacting CPU performance.
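A minimal sketch of the Ajax technique described in this item, written as browser-side TypeScript; the URL /quotes.xml and the element id "quote" are invented placeholders, not part of the article.

```typescript
// Minimal Ajax sketch in the spirit of the technique described above: JavaScript
// issues an asynchronous request, receives a small XML document, and updates
// one region of the page without a full reload.
// The URL "/quotes.xml" and element id "quote" are hypothetical examples.

function refreshQuote(): void {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", "/quotes.xml", true); // true = asynchronous

  xhr.onload = () => {
    const doc = xhr.responseXML;
    if (xhr.status !== 200 || doc === null) {
      console.error(`Request failed: ${xhr.status}`);
      return;
    }
    // Pull just the data we need out of the XML payload.
    const price = doc.querySelector("price")?.textContent ?? "n/a";

    // Update only the affected element; the rest of the page stays untouched.
    const target = document.getElementById("quote");
    if (target !== null) {
      target.textContent = `Latest price: ${price}`;
    }
  };

  xhr.send();
}

// Poll periodically so the page stays current without reloading.
setInterval(refreshQuote, 15_000);
```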
Pundits Discuss the Internet's Future
Wall Street Journal Online (05/05/06)
ICANN Chairman Vint Cerf and CNET Networks editor at large Esther Dyson
weigh in on the future of the Internet. Both agree that the Internet will
be ubiquitous in the future, pervading every aspect of our lives, but where
they differ is in the specifics that they stress. Cerf stresses the
growing interlinking of broadband and mobile devices and sees the Web
gradually becoming integrated with entertainment and consumer electronic
equipment, then household and office equipment, then our cars, and then our
bodies. He sees broadband reaching Mars through an Interplanetary Internet
by the close of the decade. Thus users on Earth will be able to view and
control future missions to Mars and eventually other planets. Cerf
believes that the domain name system must be outfitted with a capability to
process different languages. Cerf writes, "The Internet reaches only about
a billion users so there are another 5.5 billion to go. It is beginning to
include a good deal of information in many languages, but the domain name
system needs to be outfitted with a similar capability." Dyson agrees that
the Internet's spread is inevitable and says that, eventually, every thing
will have an online identity. Dyson writes, "The Internet so far has
existed mostly in cyberspace, linking computers fed data by humans and by
other computers. The Internet of the future will be much more tightly
linked to physical space. First of all, many of its future users will
connect via cell phones, and the Net will know more about their physical
locations and their identities than it does about those who reach it by
computer." However, she says that with the ubiquity of the Web also will
come an erosion of user privacy. She says one of the big challenges is
determining who controls the vast quantities of information a Net-connected
future entails.
Change in ICT Design for Visually Impaired Urged
Silicon Republic (05/04/06) Larkin, Elaine
Information technology will leave the visually impaired behind if the
issue of accessibility is not realistically addressed, according to the
Visually Impaired Computer Society (VICS), part of the Irish Computer
Society. The lobby group says the use of tactile, audio, or large print as
human computer interface features would make computer products more
accessible for the visually impaired, but the conventional computer screen
prevents them from using electronic systems. "At present the vast majority
of ICT products are completely unusable by those with vision problems
except by means of expensive and inelegant bolt-on interfaces," says IT
professional Ronan McGuirk, founding member of VICS. McGuirk has authored
a paper that calls for the implementation of the Design for All (DFA)
principles during the specification, design, and manufacture of ICT products.
VICS, which plans to introduce the paper May 12 in Dublin, says it would be
cheaper to incorporate accessibility during the design stage than to do so
later on, and adds that better labeling is needed to make accessible
products easier to identify. VICS Chairman Tony Murray says Apple, which
has built a screen reader into its operating system for the new Mac, is one
of the few companies that has embraced the process. Murray, a software
engineer at AIB, adds that VICS is at work drafting a paper on standards
for accessible products.
Cyberspace Running Out of Room
TechWeb (05/04/06) Sullivan, Laurie
Frost & Sullivan says the popularity of smart phones, IPTV, and other
gadgets will soon force cyberspace to run out of room, meaning available
Internet addresses.  By 2012, about 17 billion devices will connect to the
Internet, according to IDC.  Experts
also say the current Internet protocol version 4 (IPv4) limits multimedia
and data communication services, including mobile IP, P2P, and
video calls.  The Office of Management and Budget is requiring all federal
networks to have the ability to send and receive IPv6 packets by the middle
of 2008.  A study of IPv6 migration by the National Institute of Standards
and Technology (NIST) and RTI International estimates that just 30 percent
of Internet service provider networks will support IPv6 by 2010, and 30
percent of user networks by 2012.  The
NIST/RTI study estimates it will cost the United States $25.4 billion
between 1997 and 2025 just to upgrade from IPv4 to IPv6. Some equipment
manufacturers will offer the transition to the IPv6 protocol in regular
upgrades, while some will charge a fee for the features. Sam Masud at
Frost & Sullivan advises companies to start building transition strategies
now. "Lucent says they can convert all their mobility applications to
support IPv6 in about three years," says Masud. "It wouldn't surprise me
if that self-imposed timetable is extended."  Companies can currently
choose from three migration strategies--dual stacks, tunneling, and
translation.
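Of the three strategies, dual stacks are the most straightforward to picture: a service listens on IPv6 and IPv4 at once. The Node.js/TypeScript sketch below is an illustration with an arbitrary port, not guidance from the article; whether a single "::" listener also accepts IPv4-mapped connections depends on operating-system settings.

```typescript
// Dual-stack sketch: one service reachable over both IPv4 and IPv6.
// Port 8080 is an arbitrary example. On many systems binding to "::" also
// accepts IPv4 connections as IPv4-mapped addresses; where it does not,
// a second IPv4 listener would be needed.

import * as net from "net";

const server = net.createServer((socket) => {
  // remoteFamily reports "IPv4" or "IPv6" for the accepted connection.
  socket.end(
    `Hello from a dual-stack server. You connected over ${socket.remoteFamily} ` +
      `from ${socket.remoteAddress}\n`
  );
});

server.listen(8080, "::", () => {
  console.log("Listening on [::]:8080 (IPv6, plus IPv4-mapped where supported)");
});
```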
Richard Stallman Sets the Free Software Record Straight
Linux Insider (05/02/06) LeClaire, Jennifer
In a recent interview, Free Software Foundation President Richard Stallman
discussed his thoughts on the movement that has become his life's work, the
draft update of the General Public License (GPL), and how the GNU Project
is often conflated with open source. Stallman looks at free software as an
essential human right, including the freedom to run a program in any way
the user desires, freedom to study and change the program's code, freedom
to copy and distribute the software, and the freedom to distribute altered
versions of the code at any time. The desire to safeguard those freedoms
became the foundation for the GNU operating system. Linux, initially
released in 1991, was re-licensed under the GNU GPL in 1992 as free
software. Its growing popularity for practical applications throughout the
1990s meant that many Linux users were oblivious to the ethical motives
behind the GNU Project. Stallman rejects the label "open source" for his
project because it is based on a more pragmatic philosophy that ignores the
loftier ideals of free software. While it only departs from the previous
version in the details and offers no new sweeping ideals, version 3 of the
GPL contains an explicit patent license grant that applies worldwide, as
well as a patent retaliation provision to deter anyone from trying to sue
for infringement on a GPL-covered application. GPLv3 also contains a
provision to prevent what Stallman calls "tivoization," the practice whereby
a device powered by free software refuses to run modified versions of its
source code.  "Tivoization turns freedom 1, the freedom to make and use a
modified version of the program, into a sham," Stallman said. Another
important provision in the draft update calls for international application
of the license, regardless of the country's specific language or
intellectual property laws.
Cybersecurity Research Plan Identifies Threats
Federal Computer Week (05/01/06) Vol. 20, No. 13, P. 54; Sternstein, Aliya
Industry leaders who have been urging the Bush administration to devote
more resources to cybersecurity are optimistic that a recently issued
report by the National Science and Technology Council will lead to
increased federal funding. The "Federal Plan for Cyber Security and
Information Assurance Research and Development" highlights urgent threats
to U.S. technological infrastructure and calls for increased federal
funding for research that would help manufacturers incorporate greater
security features into their products before they are delivered. "This is
the first document that I've seen that focuses on outcomes rather than
favorite research projects," said Alan Paller of the SANS Institute. The
document recommends exploring security issues that could arise from new
broadcast protocols, wireless protocols, and ad hoc networks, while also
cautioning against potential threats from optical computing, quantum
computing, and pervasively embedded computing. The plan also calls for
much needed metrics to gauge the government's ability to hold up against an
attack, Paller said, though he criticized the council for not including
specific figures for how much the government should pay for the research.
Included in the proposal are the public Internet and the networks and
systems that control the power grid, communications systems, and other
vital elements of infrastructure. The plan identifies software testing,
wireless security, access control, and authentication as some of the
highest funding priorities. Increased funding is the key to acting on the
report's recommendations, says Ed Lazowska, who served as co-chairman of
the now-defunct President's IT Advisory Committee. Lazowska said he has a
simple message for John Marburger III, the president's science advisor:
"Spare me the recommendations and show me the money," adding that "it's
time for leadership and investment."
IDEs of Change
Electronic Design (04/27/06) Vol. 54, No. 9, P. 52; Wong, William
Parallel to the consolidation of the integrated development environment
(IDE) space is an increase in IDE sophistication, and general IDEs that are
open to plug-ins from third parties are rising to the top of the field.
The complexity of developing an IDE lies in the many assorted features an
IDE has and the support it delivers; one approach is to organize IDE
development around the target operating system or platform and the
programming language.  An editor, a compiler tool chain, and a debugger are
usually the fundamental constituents of an IDE, but the growing complexity
of programming projects has nurtured a similar growth in IDE complexity.
IBM's open-source Eclipse platform, which makes the plug-in specification
and underlying IDE available to any third party, lets tool vendors build
against a single platform while allowing their products to be used by
anyone working with an Eclipse-based IDE.  Strategies
for maintaining IDE simplicity include the improvement of developers' IDE
experience through the use of wizards, profiles, and enhanced help
mechanisms that notify developers about features or simplification of
feature access; avoidance of turning the IDE into an all-encompassing
system; and concentration on a specific environment. Macraigor Systems
President Craig Haller notes that most debugging methods used by developers
remain relatively unchanged after three decades, and though more complex
tools are offered by numerous debuggers, the tools are often hard to use,
target a specific IDE, or are simply unknown to developers. Some debugging
improvements have originated outside of IDEs.
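The plug-in arrangement described above follows a familiar pattern: the host IDE publishes a small contract, and third-party tools register implementations against it without the host knowing about them in advance. The TypeScript below is a generic illustration of that pattern with invented names, not the Eclipse plug-in API.

```typescript
// Generic plug-in pattern: the host IDE defines a small contract, and
// third-party tools register implementations without the host knowing
// about them in advance. This illustrates the idea only; it is not the
// Eclipse plug-in specification.

interface IdeHost {
  registerCommand(name: string, run: () => void): void;
}

interface Plugin {
  id: string;
  // Called once at startup; the plug-in wires itself into the host.
  activate(host: IdeHost): void;
}

class Ide implements IdeHost {
  private commands = new Map<string, () => void>();
  private plugins: Plugin[] = [];

  registerCommand(name: string, run: () => void): void {
    this.commands.set(name, run);
  }

  install(plugin: Plugin): void {
    this.plugins.push(plugin);
    plugin.activate(this); // plug-in extends the host only through the contract
  }

  runCommand(name: string): void {
    this.commands.get(name)?.();
  }
}

// A third-party debugger plug-in, written against the contract alone.
const debuggerPlugin: Plugin = {
  id: "example.debugger",
  activate(host) {
    host.registerCommand("debug.start", () => console.log("debug session started"));
  },
};

const ide = new Ide();
ide.install(debuggerPlugin);
ide.runCommand("debug.start"); // prints "debug session started"
```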
A Universal Translator in Your Pocket
New Scientist (04/27/06) Vol. 190, No. 2549, P. 26; Graham-Rowe, Duncan
Computer translation researchers are improving translation systems by
giving them the ability to learn new languages on their own. They are
turning to software that trains translation systems on large amounts of text,
learning how different types of words are used in various positions in
everyday sentences; unlike previous rule-based programs, the resulting
applications do not get confused by exceptions to grammatical rules or bad
grammar.  "After decades of stagnation, something major is happening to
create the technologies we have always dreamed about," says Alex Waibel,
director of the International Center for Advanced Communications
Technology. Researchers at Carnegie Mellon University in Pittsburgh have
developed the TransTec (Translation Systems for Tactical Use) program--a
handheld device with speech recognition, translation, and voice synthesis
software--that the U.S. military would like to use to translate spoken
conversations with Iraqis in real time. Waibel also has a team at the
University of Karlsruhe in Germany that has trained a system on European
Parliament session speeches, in order to develop an application that can
translate lectures in real time and run on systems that are more powerful
than handheld computers. However, processing power of handheld devices
will continue to improve, and some observers are optimistic that in the
next couple of years people will be able to speak into their cell phone to
translate their words for someone else, and have the other person speak
into their phone to translate their response. The advances could
eventually lead to a universal translator that whispers translation in your
ear as you carry on a conversation. Google is considering taking advantage
of translation technology, as are the developers behind the new European
search engine Quaero.
SysML Effort About to Bear Fruit
SD Times (05/01/06) No. 149, P. 15; Moore, Alan
Artisan Software Tools' Alan Moore, architect of the SysML 0.99
specification, expects the spec to be adopted by the Object Management
Group as a technique for modeling complex systems. SysML 0.99 is the end
product of the SysML Merge Team (SMT), whose participants include some of
the most prestigious tool vendors, leading industry users, professional
organizations, and government bodies. The SysML visual modeling language
extends UML 2 to facilitate complex system specification, analysis, design,
and validation. SysML reuses a subset of UML 2 concepts and diagrams, and
enhances them with some novel diagrams and constructs that are applicable
to systems modeling. The spec's major structural extension, the Block, is
employed as a general-use hierarchical structuring tool that defines a
system as a cluster of components and links between them that effect
communication and other types of interrelationships. The other major SysML
extension is support for requirements. SysML offers extensions to
activities to characterize how material, energy, or information is
distributed throughout a system, permitting modelers to prescribe
limitations on the rate at which items flow along edges in an activity, or
into and out of behavioral parameters.  SysML describes system
properties and their relationships through the use of parametric models,
which requires the presence of a ConstraintBlock to define a series of
parameters and one or more expressions that state how a change in one
parameter's value affects the other parameters' values.
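As a rough programming analogy for that parametric idea (illustrative TypeScript with an invented Newton's-law example, not SysML notation): a constraint block names its parameters and an expression relating them, so a value for any one parameter can be derived once the others are known.

```typescript
// Rough analogy to a SysML ConstraintBlock: named parameters plus an
// expression relating them. Illustrative TypeScript, not SysML syntax;
// the F = m * a constraint is an invented example.

interface ConstraintBlock {
  parameters: string[];
  // Given all but one parameter, solve for the missing one.
  solveFor(unknown: string, known: Record<string, number>): number;
}

const newtonsSecondLaw: ConstraintBlock = {
  parameters: ["F", "m", "a"],
  solveFor(unknown, known) {
    switch (unknown) {
      case "F": return known.m * known.a; // F = m * a
      case "m": return known.F / known.a; // m = F / a
      case "a": return known.F / known.m; // a = F / m
      default: throw new Error(`Unknown parameter: ${unknown}`);
    }
  },
};

// A change in mass or acceleration is reflected in the force that
// satisfies the constraint.
console.log(newtonsSecondLaw.solveFor("F", { m: 1200, a: 2.5 })); // 3000
```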
RFID: Beyond the Drive for Five
Design News (04/24/06) Vol. 61, No. 6, P. 48; Murray, Charles J.
Although radio-frequency identification (RFID) tags have not yet reached
the much-desired nickel price point, the tremendous strides the technology
has made in terms of cost and performance should not be discounted. The
price of RFID chips has been falling about 5 percent to 10 percent a year
for the past six years, concurrent with technological improvements; RFID
tags are being used in applications that were unheard of 10 years ago,
despite the failure to cut their price down to five cents a unit.
RFID tags are expected to be incorporated into low-cost everyday objects,
which will eventually lead to an "Internet of things" wherein virtually
everything is networked through the Web, predict researchers. The Internet
of things cannot be realized without low-cost RFID tags, but researchers
anticipate that everyday items will include RFID via integration into the
corrugate of cardboard boxes during manufacture. Ongoing initiatives in
this area will play a vital role in reducing the price of RFID, because
such integration removes the need for certain tag components.  Price is
still one of the few
advantages bar codes have over RFID tags, which is why efforts to drive the
tag cost down to five cents are still going strong. Among the techniques
RFID chip makers are employing to lower costs is "self-adaptive silicon,"
which generates special transistors featuring gates that can store bits of
memory. Sanjay Sarma, associate professor of mechanical engineering at
Massachusetts Institute of Technology and research director for MIT's
Auto-ID Center, says, "These RFID technologies will co-exist with the bar
code for a long time into the future. But they will provide information
that a bar code can't...The question now is the tipping point. When do you
get to the percentage that causes you to say, 'I'm going to put the tag
inside the corrugate?' In the next year, we could see it happen."