Volume 7, Issue 852: Monday, October 10, 2005
- "U.S. Cybersecurity Due for FEMA-Like Calamity?"
CNet (10/10/05); McCullagh, Declan
The Department of Homeland Security's cybersecurity division has been
coming under increasing scrutiny as critics have drawn a parallel to the
lack of preparedness evidenced by FEMA in the wake of Hurricane Katrina.
The flight of many senior staff members is one of the most alarming signs
of weakness in the division. The top position in the cybersecurity unit
has been unoccupied since Robert Liscouski quit in January, though Homeland
Security Secretary Michael Chertoff has promised to fill the vacancy. Many
in the industry believe there is little appealing in that position, as its
occupant is far more likely to meet with blame than praise. Legislation
has been reintroduced in Congress this year that would establish the
position of assistant secretary for cybersecurity, who would report to
Chertoff directly. Currently, cybersecurity is removed from the top
position by multiple layers of bureaucracy. The Homeland Security
Authorization Act for 2006, which contains the cybersecurity
reorganization, has passed in the House, but has been stalled in the Senate
Homeland Security committee since May. The Sept. 11 attacks marked a
clear shift away from cybersecurity as a top government priority, as
preparedness for hypothetical threats gave way to the realities of
responding to Al Qaeda and the invasion of Iraq. Many federal
cybersecurity efforts had been characterized as ineffectual before Sept.
11, and the consolidation of divergent programs into the DHS did little to
advance government readiness to contend with spyware, spammers, and cyber
terrorists, as the department has labored under incompatible computer
systems and inefficient data sharing practices. The department counters
that measures such as the creation of the National Cyber Alert System
demonstrate a response effort coordinated with private industry, though
University of Washington computer science professor Ed Lazowska says the
department still has a lamentably narrow focus on cybersecurity.
Click Here to View Full Article
- "In a Grueling Desert Race, a Winner, But Not a Driver"
New York Times (10/09/05) P. A24; Markoff, John
A robotic vehicle known as Stanley earned the Stanford researchers who
created it a $2 million prize after winning the Grand Challenge, a 132-mile
race through the Nevada desert. The Pentagon-sponsored race was designed
to advance the cause of artificial intelligence and self-guided vehicles,
and saw 23 entries from coalitions representing universities, automotive
firms, and computer and aerospace companies. DARPA designed the
competition to scour sources of innovation and research that might
otherwise go unnoticed by the leading military firms. The second Grand
Challenge boasted much greater success than the first event held in March
2004, in which many vehicles did not get out of the starting gate and none
finished. Two entries from Carnegie Mellon, a Hummer and a Humvee,
finished behind the Stanford team's modified Volkswagen Touareg. Stanley
averaged about 17 miles an hour on a course that tested the vehicles'
navigational abilities across mountain passes and through tunnels in which
the vehicles could not receive satellite navigation signals. Throughout
the race, DARPA monitored for illicit radio signals that the teams might
use to guide their vehicles, which, under the rules of the Grand Challenge,
had to complete the course without human intervention. The competition was
born out of a Congressional directive for the Pentagon to automate
one-third of the military's land vehicles by 2015. While much of the
self-guiding hardware has not changed since the first Grand Challenge,
there have been significant advances in the software. The official Web
site of the race counted more than 12 million hits in eight hours.
Click Here to View Full Article
(Access to this site is free; however, first-time visitors must register.)
- "U.S. Won't Lose Its Tech Edge, Says Microsoft's Mundie"
IDG News Service (10/07/05); Ribeiro, John
Microsoft CTO Craig Mundie believes the United States will maintain its competitive advantage over emerging powers such as China and India due to
its strong capacity for innovation. Mundie's remarks come amid increasing
industry concern that a lack of government funding for research and
development will erode innovation and compromise the United States'
position as the global leader in technology. Speaking in Bangalore at the
inauguration of a Microsoft research center there, Mundie noted that India
will be a focal point for innovation due to the nation's well-trained
workforce, though he called for greater protection for intellectual
property to facilitate the growth of native Indian businesses. The central
challenge will be in the commercialization of the inventions produced by
India's engineers. The advances in software seen in India have been
impressive, though the country has yet to produce businesses that target
the specific demands of the market. To keep up with demand, India needs to
steer a greater portion of its students toward computer science, and of
those, there must be some with a specific concentration in the business of
software, in addition to the actual development.
Click Here to View Full Article
- "MIT Designs Low-Tech Flood Alarm"
Boston Globe (10/10/05); Lloyd, Marion
A group of students from MIT has been developing a flood warning system for Honduras' Aguan River to augment the satellite-powered system that was introduced in the wake of Hurricane Mitch in 1998. Maintenance of the
existing system is costly, and it is frequently the target of vandalism
that confounds its numerous other technical problems. MIT's FloodSafe
Early Warning System employs inexpensive automatic radios to relay
water-level information from flood-gauge sensors within the river. Each
sensor system will cost roughly $2,000, compared to the
satellite-transmitting sensors, which cost $20,000 each. The radio sensors
can be repaired by local technicians, whereas the satellite-based devices
required U.S. trained technicians to oversee repairs. Due largely to the
absence of an early warning system, Hurricane Mitch killed almost 10,000
people and left more than 1 million homeless. The central achievement of
the new system is to eliminate the need for human monitoring of the river
level, as most tropical rains occur overnight, when there is no one
checking the water level. Instead, the radios sound an alarm relayed via
20-foot radio towers which, like the sensors themselves, are designed to be
impervious to vandalism. The students expect the system to be in place by
2006, and foresee its widespread use throughout the Third World if it
proves successful. Backers of the satellite system maintain that theirs is
more reliable than the students' system, as the radio towers could get
knocked down in a tropical storm.
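The article gives no implementation details, but the heart of such a
warning system is simple threshold logic. The Python sketch below is
purely illustrative; the sensor IDs, flood-stage thresholds, and alarm
hook are invented, and the real FloodSafe relay is radio hardware
rather than software.

    # Purely illustrative threshold logic for a flood-gauge relay; the
    # sensor IDs, flood stages, and alarm hook are invented.
    ALERT_LEVELS_M = {"aguan-upstream": 4.5, "aguan-midpoint": 3.8}

    def check_reading(sensor_id, water_level_m, sound_alarm):
        """Raise the alarm when a gauge reports water above flood stage."""
        threshold = ALERT_LEVELS_M.get(sensor_id)
        if threshold is not None and water_level_m >= threshold:
            # In the real system this is relayed via the 20-foot towers,
            # so no one has to watch the river overnight.
            sound_alarm(sensor_id, water_level_m)

    # Example run, printing instead of keying a transmitter:
    check_reading("aguan-upstream", 5.1,
                  lambda sid, lvl: print(f"FLOOD ALERT {sid}: {lvl} m"))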
Click Here to View Full Article
- "CMU Scientist Honored for Novel Method of Using Computers to Simulate Collisions of Objects"
Pittsburgh Post-Gazette (10/10/05); Spice, Byron
Carnegie Mellon's Doug James has developed a method for simulating collisions on a computer that operates a thousand times faster than
previous techniques. James, an assistant professor of computer science and
robotics, was recognized by Popular Science magazine as one of its
"Brilliant 10" young scientists, and has attracted the attention of
industry groups such as Pixar. In one application, 3,600 white plastic
lawn chairs cascade across the screen like a waterfall in a simulation
that took only a couple of hours to produce. In conventional graphics
programs, tens of thousands of tiny triangles constitute an image's
shape, each of which has to be recomputed when the object collides with
something. That process is inordinately time-consuming, which impelled
James to look for a shortcut. Since most objects can assume only a
limited number of shapes, James instructed the computer to track only
the critical points of an object's shape and orientation, and to test
only the triangles that actually touch. The resulting thousandfold
speedup yields potential applications both in computer graphics
and real-world navigation issues such as those that confound the robotics
industry. Seeking to answer his own question of whether you can "make a
virtual environment that is truly immersive," James has turned his
attention to complementing his collision simulations with comparable
applications for touch and sound.
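The article does not describe James's algorithm itself, but the general
idea of culling collision work can be sketched: run a cheap bounding-
volume test first and hand only the few surviving pairs to the
expensive exact test. The Python below is a generic illustration of
that idea, not James's actual method.

    # Generic collision culling sketch (not James's method): skip
    # per-triangle work unless cheap bounding-sphere tests say two
    # triangles might actually touch.
    import math

    def spheres_overlap(c1, r1, c2, r2):
        """Cheap broad-phase test: do two bounding spheres intersect?"""
        return math.dist(c1, c2) <= r1 + r2

    def candidate_pairs(tris_a, tris_b):
        """Each triangle is (center, radius, payload). Yield only the
        pairs worth handing to an exact triangle-triangle test."""
        for ca, ra, a in tris_a:
            for cb, rb, b in tris_b:
                if spheres_overlap(ca, ra, cb, rb):
                    yield a, b

    # Of tens of thousands of triangles, typically only a handful
    # survive the filter; skipping the rest is where the savings lie.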
Click Here to View Full Article
- "Artificial Intelligence Helps Make Life Easier"
Red and Black (10/06/05); Caldwell, John
The definition of intelligence may need to be reconsidered, because
many people still refuse to believe in artificial intelligence, says
Don Potter, director of the Artificial Intelligence Center and a
professor of computer science at the University of Georgia. "Maybe
there's a bigger, better idea of 'what is intelligence,'" says Potter,
who is involved in research for
the U.S. Department of Agriculture on genetic algorithms, which can
accommodate more variables than the human brain. Potter works with a
program that is able to model the evolution of the ecosystem better than
human observers. "The amount of artificial intelligence an average person
touches in a normal day is astounding," he says. Computers are now able to
discover stock market trends that have eluded their program creators, and
some new cars have the ability to learn the habits of the driver in order
to get better fuel mileage. Several years ago, researchers developed a
virtual assistant, complete with cameras, sensors, and a glove that
delivered signals to a blind and wheelchair-bound student who needed help
getting around campus.
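Potter's USDA models are not described in the article, but a genetic
algorithm in miniature fits in a few lines: keep the fitter half of a
random population, breed and mutate replacements, and repeat. The
Python sketch below is a toy illustration, with a trivial bit-counting
fitness function standing in for a real ecosystem model.

    # Toy genetic algorithm, purely illustrative.
    import random

    def evolve(fitness, n_genes=10, pop_size=50, generations=100,
               mutation_rate=0.05):
        # Start from a random population of bit strings.
        pop = [[random.randint(0, 1) for _ in range(n_genes)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            survivors = pop[:pop_size // 2]  # keep the fitter half
            children = []
            while len(survivors) + len(children) < pop_size:
                a, b = random.sample(survivors, 2)
                cut = random.randrange(1, n_genes)  # one-point crossover
                child = a[:cut] + b[cut:]
                # Flip each gene with small probability (mutation).
                child = [1 - g if random.random() < mutation_rate else g
                         for g in child]
                children.append(child)
            pop = survivors + children
        return max(pop, key=fitness)

    best = evolve(fitness=sum)  # maximizes the count of 1s
    print(best)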
Click Here to View Full Article
- "Why Kids Aren't Going Into IT"
InformationWeek (10/03/05); Wagner, Mitch
Businesses that hire information technology managers have to share some of the blame for the lack of interest college students are showing in
tech-related careers. The number of students in U.S. colleges and
universities pursuing degrees in computer science today is no longer enough
to fill the IT jobs that are available. For the past five years, IT
managers have slashed wages, laid off workers, and outsourced jobs to India
and China. As a recent Cap Gemini Ernst & Young study shows, companies can
pay workers in these markets substantially less than they would pay an
American IT professional. Meanwhile, last year there were 25 percent fewer
IT jobs than there were in 2000. Although InformationWeek senior executive
editor Chris Murphy notes that IT recruiting begins at home, the quotes
from IT professionals show that they do not always have good things to say
about the industry today. "I would recommend that my children look for
skills and an occupation that can last them a lifetime [40-plus years] and
not be stolen away from them by a cheaper worker/industry changes," said
Erin Wells.
Click Here to View Full Article
- "Europe Telecoms Object to EU Plan for Policing Web"
Dow Jones Newswires (10/06/05); Miller, John W.
The European Union (EU) is attempting to curtail the United States' influence on Internet governance, according to the European
Telecommunications Network Operators Association, an organization that
represents 42 top telecom companies in 35 countries. "I've been getting
urgent calls from our members, and they are upset" at the EU's latest
Internet governance proposal, says the association's director, Michael
Bartholomew. The EU's latest plan calls for "an international government
involvement at the level of principles" in overseeing ICANN. Just five of
ICANN's 21 board members and fewer than half of its employees are American.
"ICANN understands the calls for further internationalization, but we're
very concerned that the Internet technical coordination should not become
the basis for politicization," says ICANN director Paul Twomey, himself
Australian. Carl Bildt, the chairman of Swedish telecom Teleopti, says the
EU's proposal is disturbing and goes "a long way towards the position that
a number of states headed by Iran had been advocating, opening for a
political control mechanism." EU officials have been quick to say the
proposal has been misinterpreted and that the EU is opposed to governments
being involved in the day-to-day functions of the Internet.
- "The Upgraded Digital Divide: Are We Developing New Technologies Faster Than Consumers Can Use Them?"
Knowledge@Wharton (10/04/05)
Companies must take care not to upgrade and replace technology products
too quickly, lest their complexity overtake the speed with which
consumers learn to use them. In a new paper, Wharton's Robert Meyer and
Shenghui Zhao, along with Singapore Management University's Jin Han,
point out that
consumers' decisions to buy new products fall into a "paradox of
enhancement," in which the consumer purchases next-generation products
expecting to benefit enormously from new features, only to be discouraged
from using them because they are overwhelmingly complex. The authors say
consumers' purchase decisions are informed by overly optimistic beliefs
about the value of the latest upgrades. Conversely, the researchers note
that people who buy new and enhanced products to avoid product
deterioration, as opposed to obsolescence, are less inclined to acquire
next-generation products. The complexity of products often compels
consumers to seek advice on purchasing decisions from a spouse, friends,
associates, salespeople, or published experts, according to Wharton
marketing professor Barbara Kahn. There are various approaches companies
can take to draw customers, specifically average users, to their products:
Branding is one strategy, while increasing consumer-support capabilities
is another. Wharton marketing professor Peter Fader recommends consumers
disregard new product upgrades until they are used by early adopters.
Wharton IT director Kendall Whitehouse expects simplicity to become a key
factor in consumers' purchase decisions, and foresees the emergence of
products that are easier to use yet more sophisticated in terms of design
and capabilities; broader user interface standardization will be part of
this trend.
Click Here to View Full Article
- "Researchers Advance Versatile Tech"
The Pitt News (10/03/05); Medici, Andy
The University of Pittsburgh held a special event Sept. 28, 2005, commemorating the creation of the Radio Frequency Identification Center for
Excellence. Officials from approximately 30 organizations attended the
gathering and were treated to demonstrations of RFID technology. As part
of one demonstration, groceries embedded with RFID tags were purchased
without removing the items from a shopping cart. "We walked it past a
checkout station, and everything was rung up without taking it out of the
cart," says Marlin Mickle, director of the center, who is also a professor
of electrical, industrial, and computer engineering at the university.
Mickle says there are numerous possibilities for using RFID for homeland
security, medical research, and communications, but more research needs to
be performed before it is applied in certain life-critical circumstances.
The center consolidates the university's research in RFID, and Gerald
Holder, dean of the School of Engineering, said it "will be a powerhouse of
creativity and technological innovation that should lead to significant
improvements in the economy and simplify the lives of consumers."
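The article does not say how the demo's back end worked, but the
checkout step reduces to reading every tag in range and totaling
catalog lookups. The sketch below is a hypothetical illustration, with
invented tag IDs and prices.

    # Hypothetical checkout logic: one reader pass collects every tag
    # in the cart, and each tag maps to a catalog entry. Tag IDs and
    # prices are invented for illustration.
    CATALOG = {
        "E200-0001": ("milk, 1 gal", 2.89),
        "E200-0002": ("bread", 1.99),
        "E200-0003": ("coffee", 6.49),
    }

    def ring_up(tags_seen):
        """Total a cart without unloading it, ignoring unknown tags."""
        total = 0.0
        for tag in tags_seen:
            if tag in CATALOG:
                name, price = CATALOG[tag]
                print(f"{name:<12} ${price:.2f}")
                total += price
        return total

    print(f"total        ${ring_up(['E200-0002', 'E200-0003']):.2f}")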
Click Here to View Full Article
- "Devices Help the Blind Cross Tech Divide"
CNet (10/05/05); Singer, Michael
The Smithsonian Institution estimates that assistive technology constitutes a $5.4 billion market, which is being fueled by the aging populations of industrialized nations as well as a government initiative to address the
requirements of special needs groups. Examples of recent assistive
technology products include software from Eatoni that enables visually
impaired users to read email on cellular phones, and an AgentSheets device
that helps the handicapped use public transport by tracking GPS-equipped
buses, notifying users when the bus they want is coming, providing audio
and visual cues to help them board, and telling passengers when they have
reached their destination. HumanWare's Trekker, meanwhile, is a GPS-based
talking personal guide for the blind that provides digital maps. There are
also refreshable Braille displays that allow the blind to read information
displayed on a computer screen. The World Health Organization says that
between 750 million and 1 billion people worldwide suffer some type of
mobility, cognitive, speech, vision, or hearing disability, while 2002 census
figures indicate that over 54 million U.S. residents are disabled in some
way, a number that will undoubtedly increase with the impending retirement
of baby boomers. Major tech companies developing or supporting assistive
technologies include IBM, Microsoft, Hewlett-Packard, Dell, and Apple,
while smaller companies such as Electronic Vision Access Solutions (EVAS)
derive much of their business from clients in the federal and state
governments. EVAS, for instance, has a contract with Dell to develop
customized PC technology for veterans participating in rehab programs for
the visually impaired through the Department of Veterans Affairs.
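No implementation details are given for the AgentSheets device, but the
notification step amounts to a proximity check between a bus's GPS fix
and the rider's stop. The Python sketch below illustrates that check;
the coordinates, 200-meter radius, and function names are all invented.

    # Hypothetical proximity check behind a bus notifier.
    import math

    def meters_apart(lat1, lon1, lat2, lon2):
        """Equirectangular approximation; fine over city distances."""
        dlat = math.radians(lat2 - lat1)
        dlon = math.radians(lon2 - lon1) * math.cos(math.radians(lat1))
        return 6371000 * math.hypot(dlat, dlon)

    def maybe_notify(bus_fix, stop, announce, radius_m=200):
        if meters_apart(*bus_fix, *stop) <= radius_m:
            announce("Your bus is arriving; move to the curb.")  # audio cue

    # Example with made-up coordinates, printing instead of speaking:
    maybe_notify((40.4442, -79.9530), (40.4440, -79.9525), print)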
Click Here to View Full Article
- ""Next-Gen Net Seen at a Crossroads"
Network World (10/03/05); Duffy, Jim
At the recent Next Generation Networks 2005 conference, the hot topic of debate was whether the network of the future will be more of an offshoot of the current Internet or the public switched telephone network (PSTN).
Given that there is considerable overlap in the functionality of both, the
choice is not clear, and is indeed confounded by aging platforms that were
conceived long before the advent of applications such as wireless, video,
and message-based routing. The shakeout could ultimately prove costly to carriers
and consumers and redefine their relationship with each other, as well as
fundamentally alter the business models under which carriers operate. The
telecommunications industry is placing its bets with the IP Multimedia
Subsystem (IMS) architecture that would supplant the infrastructure that
powers the current circuit-switched telephone network, and would also
support services such as voice mail, file sharing, and text messaging;
while IMS is backed by the telecom coalition Third Generation Partnership
Project, the IETF and other critics claim it gives carriers too much
license at the expense of consumer choice. The major alternative, the
Internet, is notoriously vulnerable to denial-of-service attacks and other
security breaches, causing some, such as MIT senior research scientist
David Clark, to call for a rebuilding of the platform from scratch. The
level of comfort most users feel with the Internet is a strong argument
against wholesale replacement of the current network. One alternative is
the NSF's Global Environment for Network Investigations (GENI), which
supports new research and deployment initiatives, inviting the gradual
evolution that until recently had seemed a dim prospect.
Click Here to View Full Article
- "Landscapes on the Computer"
Fraunhofer-Gesellschaft (10/05)
Average Internet users have taken to maps available online to gain aerial views of the earth. They are joining city planners, architects, public safety agencies, and other specialists in viewing virtual images of the
earth, as a result of the emergence of software programs such as Google
Earth and NASA World Wind that can be accessed from the desktop of a standard PC.
While average Internet surfers have used two-dimensional map
representations to plan trips, the new programs offer three-dimensional
geo-data, which allows users to view geographical renderings that are more
closely aligned with the natural human perception of terrain. For cyclists
and hikers, the visuals can help them to determine the uphill and downhill
grade of a route, instead of having to calculate raw numbers. The
Fraunhofer Institute for Computer Graphics Research IGD offers the
CityServer3D software, and any browser that makes use of the Java 3D
plug-in can offer direct views of its data. Routes can be imported into
the software to obtain information such as gradients, differences in
altitude, and the lengths of the various areas of the route.
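The article does not show CityServer3D's internals, but once a route
carries elevation samples, grades and climb totals fall out of simple
differences. The sketch below illustrates the arithmetic with invented
sample points.

    # Illustrative route statistics from (distance, elevation) samples.
    def route_stats(points):
        """points: list of (distance_m, elevation_m), sorted by distance."""
        climb = drop = 0.0
        grades = []
        for (d0, e0), (d1, e1) in zip(points, points[1:]):
            rise, run = e1 - e0, d1 - d0
            grades.append(100.0 * rise / run)  # percent grade of segment
            if rise > 0:
                climb += rise
            else:
                drop -= rise
        return grades, climb, drop

    grades, climb, drop = route_stats([(0, 210), (800, 250), (1500, 235)])
    print(grades, climb, drop)  # [5.0, about -2.1], 40 m up, 15 m down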
Click Here to View Full Article
- "White Hat, Gray Hat, Black Hat"
Federal Computer Week (10/03/05) Vol. 19, No. 34, P. 19; Arnone, Michael
When cyberattacks first appeared they were often the product of loosely
organized or independent hackers who viewed the disruption of networks or
Web sites as a path to earn prestige among the community and build a
reputation. Modern attacks increasingly stem from highly centralized
outfits, often commissioned by foreign governments or organized crime
syndicates with profit as their sole motivation. Because hackers typically
secure their computers better than most other users, the government and
commercial communities could stand to learn a lot from white and gray hat
hackers. White hat hackers, the professionals commissioned to scour a
system for vulnerabilities to bolster its security, have long used
penetration testing to search for weaknesses, and that method is enjoying
increased popularity in the federal government. The spate of government
compliance regulations mandating greater protection of data is rooted in
actual hacking incidents. Many military and intelligence agencies maintain
a lively exchange with the hacking community, tipping each other off to new
developments and threats, though some more traditional agencies, such as
the FBI and the Secret Service, are reluctant to enter into dialogue with a
community they mistrust and view as criminals. Hackers typically follow
the path of least resistance, meaning they will only hack as much as they
have to in order to access a network. Neither government nor industry taps
into the hacker community as much as they could, owing largely to the
misconception that all hackers are black hats. There is also often a
disconnect between the independent mindset of the hacker and the procedural
structure of government agencies and large corporations. As cybersecurity
becomes a more critical issue, though, government and industry must
approach it with the vigilance and tenacity that characterize the hacker
community.
Click Here to View Full Article
- "Competitiveness, Truth, and Today's Universalities"
InformationWeek (10/03/05) No. 1058, P. 88; Evans, Bob
As the fissure widens between the majors college students are selecting and the skill sets in the greatest demand by employers, universities rarely shoulder the blame, as more popular targets include offshore outsourcing,
the dubious future of IT as an industry, and a declining pay scale.
Universities claim to be in the business of demand fulfillment, and insist
that the choice of major should be left up to the students. For his part,
Bill Gates conducted a speaking tour at five universities where he
answered a question about offshore outsourcing by saying he felt it was
important to keep IT development centralized, and vowed that
Microsoft would not pursue major foreign labor initiatives in his lifetime.
Since that speech was given a year and a half ago, however, Microsoft has
been significantly increasing its hiring activities in India, owing to a
mounting shortage of qualified U.S. graduates. Critics contend that as the
global marketplace has become increasingly competitive, with China, Japan,
and Europe emerging as real superpowers, the U.S. university system has
abandoned its responsibilities to produce the most capable graduates and
future innovators in favor of vacuous multiculturalism initiatives. Hoover
Institution fellow Victor Hanson argues that university presidents have
succumbed to campus orthodoxy and forsaken their primary commitments to
honesty and truth. One possible, if unlikely, solution would be to tie
university funding to the school's demonstrated ability to produce
graduates with the required skills to comprise the next generation of
innovators needed to preserve U.S. global hegemony.
Click Here to View Full Article
- "Double Identity"
CIO Insight (09/05) No. 57, P. 33; Rothfeder, Jeffrey
Recent incidents in which prestige data management companies lost
sensitive data through theft, or more likely mishandling, have shaken
consumers and security experts alike, and spurred U.S. lawmakers to
propose bills that introduce strict regulation. The most sweeping
proposal is the Personal Data Privacy and Security Act of 2005, which
sets severe monetary and criminal penalties for companies that
compromise or fail to adequately
protect personal data. One section of the bill mandates that companies
engaged in interstate commerce and possessing at least 10,000 digital files
on individuals must establish a data protection program guaranteeing that
sensitive records will remain confidential and personally identifiable
information will not be accessed without authorization. Publication of
data privacy procedures and regular testing of system security are also
required, and failure to comply with these rules could mean fines and
government prosecution for violators. Data encryption is the minimum
requirement for satisfying the bill's criteria, and companies that suffer
data breaches must immediately publicize them, unless they employ
encryption to make sensitive data illegible to information thieves. On the
other hand, Cyber Security Industry Alliance executive director Paul Kurtz
thinks third-party certification of data security programs could "drive
insurance companies to underwrite policies that cover losses for data
security breaches because they would have real data that could help them
determine risk." Another critical element for fulfilling the bill's
benchmarks is bolstered authentication of individuals as a measure to
prevent unauthorized people from accessing or downloading information. If
the effects of earlier data protection legislation such as HIPAA or the
Gramm-Leach-Bliley Act are any indication, the Personal Data Privacy and
Security Act may very well be the driver for more responsible and secure
data management among businesses.
Click Here to View Full Article
- "Big and Bendable"
IEEE Spectrum (09/05); Chalamala, Babu R.; Temple, Dorota
Lightweight flexible electronics promises to deliver scores of long dreamt-of products, but the technology will remain a niche industry without
inexpensive fabrication techniques. Another challenge to broader flexible
electronics applications is the need for a plastic-compatible transistor
technology capable of switching millions of times a second, and the answer
may be found in amorphous silicon or organic polymers. The manufacturing
processes for plastic circuits are simpler than those for conventional
silicon ICs, which means plastic circuits can be theoretically fabricated
at less cost. Relatively cheap inkjet printing technologies can perform
the critical assembly processes for numerous kinds of flexible circuits,
and companies are working on solutions that involve piezoelectric and
thermal printing. Roll-to-roll processing of circuits on plastics is
another area of development, but neither it nor direct printing can produce
transistors that switch millions of times a second because of the limits of
the material, misalignment of circuit layers, the low melting temperature
of plastic, and the transistor size. The fastest transistors are those
made with single-crystal materials, whose processing temperatures can far
exceed the plastic melting point; to overcome this obstacle some
researchers are opting for conventional single-crystal wafers. Alternative
approaches include searching for plastic substrates that are less
heat-sensitive than substrates already in use, or materials whose
crystalline structure plays no part in switching speed. Flexible circuit
applications under investigation include roll-up displays, space-based
radar receivers, radio circuits that can be woven into fabrics, disposable
radio-frequency identification tags, and lightweight X-ray imagers.
Click Here to View Full Article
- "Scholarly Work and the Shaping of Digital Access"
Journal of the American Society for Information Science and Technology (09/05) Vol. 56, No. 11, P. 1140; Palmer, Carole L.
Much of the discourse in the academic community pertaining to the future of scholarly communication centers on the methods of publishing and
disseminating articles. The digital production of access resources, such
as indexes, bibliographies, and directories, figures to have a more
sweeping impact on scholarly communication than even the electronic
publication of books and journals. Conventional access resources indexed
journal entries by discipline, usually providing bibliographic information,
with the possible supplement of domain-specific descriptors and abstracts.
Historically, it has been difficult to establish meaningful relationships
among printed materials, as cross-referencing efforts often yield results
that impede researchers. The digitization of full-text articles greatly
aided the process of mining scholarly repositories, as researchers
previously relied on digital abstracts and the bibliographies of other
books and articles. As scholarly literature consolidates into
interdisciplinary bundles, research libraries are now basing their
acquisition decisions on availability, rather than on quality and perceived
demand. Though scholars are relying increasingly on digital resources,
online access has not significantly eroded the popularity of the printed
journal. Scholars in the sciences and humanities rely most heavily on the
Internet for confirmation searching, discovery searching, collecting, and
consultation, or the process of collaborative research. The digital
assemblages scholars have created to inform their research can be
essentially divided into thematic collections, or bundles of research
centered around a common theme, and literature-based discovery tools that
search databases for relationships among literature. Though the tools to
navigate scholarly repositories must match the disciplinary interests of
the researcher, it is important to avoid the fragmentation that can come
with overly specialized searches.