Keeping the Trust While Under Attack: What State CIOs
Need to Know About Evolving IT Threats
Government Technology (06/20/06) Asborno, Kim
Information security threats such as identity theft, fraud, and malware
were discussed during a NASCIO teleconference keynoted by Dr. Eugene
Spafford of Purdue University. More than $100 billion is spent every year
to fight these attacks. Government legislation such as the Real ID Act and
the Help America Vote Act makes it even easier for information to be exposed
on the Web.
Spafford weighed in on the alarming statistics and what proactive methods
need to be implemented to safeguard networks. "In 2003-2004 we saw about
4,000 vulnerabilities reported in those [commonly used software packages],"
said Spafford. "In 2005 it jumped up to about 4,600, and so far this year
we are averaging about 20 per day. That's an incredible load to try to
keep up with." Spafford suggested that vendors need to release more
products to businesses and the government that can be trusted and he
insists that firewalls are not an effective solution. Organized crime,
rather than terrorism is the biggest threat to the government and it is
getting worse in Eastern Europe and Africa, according to Spafford. A
long-term plan that consists of policymaking, education, and enforcement is
the best solution for businesses and government to fend off attacks, said
Spafford. Business and the government should consider how and where
information is being stored and limit connectivity. Eugene Spafford is
chair of ACM's U.S. Public Policy Committee;
http://www.acm.org/usacm
http://www.govtech.net/news/news.php?id=99943
Click Here to View Full Article
to the top
Tech Worker Group Files Complaints Over H-1B Job
Ads
IDG News Service (06/22/06) Gross, Grant
The Programmers Guild has launched its legal attack against U.S. companies
that advertise their preference for hiring H-1B workers, claiming that they
are in violation of the U.S. Immigration and Nationality Act, which mandates
that U.S. jobs be available to U.S. workers. Since May, the group has
filed about 100 legal complaints with the Department of Justice, with plans
to file roughly 280 more in the coming six months, according to John Miano,
founder of the guild. "Abuse of the H-1B program has become so widespread
that companies apparently feel free to engage openly" in broadcasting their
preference for H-1B workers over their American counterparts, Miano said.
The complaints come as tech companies are lobbying Congress to increase the
annual H-1B cap of 65,000, claiming that H-1B workers are needed for
positions that cannot be filled by the U.S. workforce. The Programmers
Guild says it is going after companies with wording in their ads such as
"We require candidates for H1B from India," and "We sponsor GC [green card]
and we do prefer H1B holders." So far it has only targeted ads for
computer programmers, and has not yet compiled a list of the companies that
it claims are breaking the rules, though Miano says they are mostly
boutique operations. The Information Technology Industry Council (ITI)
claims that despite the potential abuses, the cap still needs to be raised.
The ITI's Kara Calvert claims that the 40 or so vendors in the trade group
are not violating the laws, and that companies that do should be punished.
Though the 65,000 cap has been met for the 2007 fiscal year, a sweeping
immigration reform bill that passed in the Senate is likely to raise the
annual limit to 115,000.
Click Here to View Full Article
to the top
ICANN Needs to Clamp Down on Domain Name Abuse
CNet (06/21/06) Isenberg, Doug
A debate over the purpose of the Whois database is quietly taking place,
with one side arguing that the database is essential to conducting business
on the Internet and another side arguing that, for privacy reasons, domain
name registrants should not be forced to enter personal information into
the database. Meanwhile, ICANN, which meets in Morocco June 26-30, is also
pondering the issue. ICANN requires that domain name registrars collect
personal information about domain name registrants, including their names
and contact data, and enter it into the publicly accessible Whois database
so that cybersquatters, phishers, and other online crooks can be forced out
of the shadows and identified. Ensuring that the information in the Whois
database remains publicly accessible is important to protecting company
brands and, by extension, consumers on the Internet, but others argue that
the Whois database creates privacy risks. Some cybersquatters provide
false Whois information to registrars--the registrant of one particular
domain name is listed as "Meow," a cat--and it can be surmised that these
domain owners are up to no good. Many cybersquatters now call themselves
"domainers," and an entire industry of domain name "monetization" services
has allowed domainers to make money off of parked domains, many of which
are suggestive of well-known brands. These monetization services, along
with other dubious practices such as "domain tasting," are causing economic
damage to legitimate businesses, which must spend money and resources to
protect their intellectual property on the Internet. If ICANN decides to
place additional restrictions on the Whois system, these companies and
their consumers will suffer even greater harm, and the integrity of the
Internet will be compromised, writes attorney and WIPO domain name panelist
Doug Isenberg.
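As an illustrative aside not drawn from the article, the Whois lookups at
issue are simple to perform: a client opens TCP port 43 on a Whois server,
sends a domain name, and reads back the registrant record. A minimal Python
sketch, assuming the .com registry's public Whois server:

    # Minimal sketch: querying a public Whois server directly (the port-43
    # Whois protocol).  Server and domain below are examples for illustration.
    import socket

    def whois_lookup(domain, server="whois.verisign-grs.com", port=43):
        """Send a Whois query over TCP port 43 and return the raw response text."""
        with socket.create_connection((server, port), timeout=10) as sock:
            sock.sendall((domain + "\r\n").encode("ascii"))
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks).decode("utf-8", errors="replace")

    if __name__ == "__main__":
        # The registrant name and contact fields in the reply are exactly the
        # publicly accessible data the privacy debate concerns.
        print(whois_lookup("example.com"))
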
Click Here to View Full Article
to the top
Lab's Supercomputer Sets Speed Record
Inside Bay Area (CA) (06/22/06)
Researchers at the Lawrence Livermore nuclear weapons lab have set a new
record for software speed, using the world's most powerful computer to
simulate the quantum interactions of metal atoms at more than 200 trillion
calculations per second. The simulation modeled the behavior of 1,000
atoms of half-molten molybdenum, a piece of matter smaller than a DNA
strand and undetectable even under a microscope. Unlike traditional
scientific simulations that rely on physical equations, the Livermore
project delved into the curious properties of quantum mechanics as they
relate to electrons and subatomic forces. The project demonstrates the
potential of supercomputers to explore proteins, new strains of
semiconductors, and new nanotechnology materials whose behavior is largely
directed by the complex and sporadic behavior of their electrons. "The
electrons are really the key," said Livermore computer scientist Erik
Draeger. "How the electrons form bonds and how they interact determines
the properties of the material, so when you make predictions, you can be
confident in them." Qbox, the software used in the simulation, was written
by former Livermore researcher Francois Gygi. Qbox was written
specifically for Blue Gene L, the Livermore supercomputer that consists of
131,000 processors. It took two years just to get the software to run on
Blue Gene L, as the researchers had to coordinate the thousands of
processors while working with 6,500 GB of data. That method of programming
will become more common, however, as chip makers look increasingly to
multiprocessor designs. "I imagine in five or 10 years when introductory
computer classes are taught, maybe in high school, people may grow up with
that sort of mental model of parallel computing," said IBM's John Gunnels.
The team managed to coordinate all the processors so that they ran at
better than half their theoretical peak performance, a rare accomplishment
among supercomputers.
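The parallel-programming "mental model" Gunnels describes can be illustrated
with a minimal sketch, assuming ordinary Python multiprocessing rather than
the MPI-style code that actually drives Qbox on Blue Gene L: a large
computation is split into chunks, each chunk runs on its own processor, and
the partial results are combined.

    # Data-parallel sketch: split the work, run the pieces concurrently, merge.
    from multiprocessing import Pool

    def partial_sum(bounds):
        lo, hi = bounds
        return sum(i * i for i in range(lo, hi))

    if __name__ == "__main__":
        n, workers = 10_000_000, 8
        step = n // workers
        chunks = [(i * step, (i + 1) * step) for i in range(workers)]
        chunks[-1] = (chunks[-1][0], n)  # last chunk absorbs any remainder
        with Pool(workers) as pool:
            total = sum(pool.map(partial_sum, chunks))
        print(total)
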
Click Here to View Full Article
to the top
Girls Love Science at Tech Camp
Inside Bay Area (CA) (06/21/06) Mills-Faraudo, T.S.
HP Labs held its fourth annual Tech Camp this week, drawing 20
grade-school girls from the surrounding communities of Redwood City and
East Palo Alto. Taking on roles as scientists, designers, and engineers,
the young girls participated in a number of activities, from taking apart
computers to making GPS maps. HP sees the camp as a way to get more girls
interested in careers in science and technology. "We're trying to
introduce them to things that they haven't experienced in school yet or at
home," says April Slayden Mitchell, a software engineer at HP Labs.
According to the National Science Foundation, women account for only 18
percent of the scientists and engineers in the United States, and 20
percent or less of graduates with majors in computer science, engineering,
physics, and other related fields. "I think it's really cool that they're
[HP Labs] doing this event, because maybe girls think that science is just
for boys," says Montse Zamora, 13, an eighth-grader at Adelante Spanish
Immersion Elementary School. Volunteer teacher Nancy Baugher says more
emphasis should be placed on how science is taught, adding that more
hands-on activities would make the subject more interesting for both girls
and boys. For information on ACM's Committee on Women in Computing, visit
http://women.acm.org
http://www.insidebayarea.com/sanmateocountytimes/localnews/ci_3962265
Click Here to View Full Article
to the top
"Red" Whittaker: A Man and His Robots
BusinessWeek (06/26/06) No. 3990, P. 19; Arndt, Michael
When it comes to robotics technology, William "Red" Whittaker is a big
believer in designing a robot that will have a practical application.
Whittaker, director of the Carnegie Mellon University Field Robotics Center
that he founded in 1986, is widely viewed as leading the way in moving
robots from the assembly line and setting them free in the field. He has
gone from having robots tethered by command wires, such as the Remote
Reconnaissance Vehicle that delivered images from inside the contaminated
nuclear reactor at Three Mile Island in 1984, to designing robotic
technology that will allow a tractor to operate on autopilot in a field.
Deere makes use of the latter technology, which uses GPS signals and laser
scanners to determine the location of the tractor, in some of its high-end
models. At the machine shop of the Field Robotics Center, a rover built to
explore lunar craters and a laser-guided explorer for maneuvering through
mines are on display. "A vision without implementation is irresponsible,"
says Whittaker, 57. Last year, his team entered two driverless vehicles in
a government-sponsored race covering 132 miles of Nevada's Mojave Desert.
Click Here to View Full Article
- Web Link May Require Free Registration
to the top
Lost in a Sea of Science Data
Chronicle of Higher Education (06/23/06) Vol. 52, No. 42, P. A35;
Carlson, Scott
Purdue University chemical engineering professor James Caruthers warns
that without an improved methodology to store and find research data, "we
are going to be more and more inefficient in the science that we do in the
future." Librarians are being called upon to archive massive volumes of
scientific data, but this is a tough proposition because of cultural and
financial constraints: For one thing, researchers often guard their data
jealously, while traditional funding entities such as the National Science
Foundation are less accustomed to supporting infrastructure. James Mullins,
dean of Purdue's libraries, was inspired to archive science data by his
previous experience at MIT, home of the wide-ranging DSpace archival
initiative. Purdue is practically the sole supporter of the archiving
project, even though librarians are collaborating with scientists and
technology personnel to apply for grants. The Purdue effort will not
involve centralized data storage; rather, the data will be distributed
across departmental servers, faculty members' hard drives, or the
multi-university-run TeraGrid. Caruthers says this will make researchers
more comfortable because the data will be close by, and he also recommends
using a low-cost, risk-free, and voluntary participation model. The Purdue
librarians will confer with researchers and analyze the data to generate
metadata, which will be posted in a publicly searchable online catalog.
The Purdue project aligns well with the "Towards 2020 Science Report," a
study by prominent scientists which concludes that big, centralized
archives "are dangerous because the construction of data collection or the
survival of one's data is at the mercy of a specific administrative or
financial structure; unworkable because of scale, and also because
scientists naturally favor autonomy and wish to keep control over their
information."
Click Here to View Full Article
to the top
Of Different Minds About Modeling
SD Times (06/15/06) No. 152, P. 27; DeJong, Jennifer
Almost a decade after its creation, the meaning of the Unified Modeling
Language (UML) has widened, as demonstrated by discussions with five
modeling experts on UML. IBM's Bran Selic and Microsoft's Jack Greenfield
foresee a transition to domain-specific languages (DSLs) and the support of
code generation, model execution and verification, and other model-resident
metadata applications. Telelogic's Jan Popkin expects modeling to "remain
the accepted communications vehicle," while PivotPoint's Cris Kobryn has
high hopes about the long-term prospects of Model Driven Development (MDD)
and visual modeling languages while harboring concerns about the short-term
effects of UML 2.0 language bloat. Object Management Group CEO Richard
Soley not only expects UML 2.0 to be extended, but also anticipates the
definition of new UML-based standards. The experts offered various
opinions on how development teams can enhance software development by more
effectively exploiting UML-based tools: Popkin suggested using UML in an
increasingly strict manner; Selic advised developers to note the value of
using models and modeling tools; Soley observed that development groups are
primarily employing modeling languages as sketching languages; Greenfield
supported a switch to DSLs, whose tool extensibility and user interfaces
outclass those of UML profiles; and Kobryn recommended that developers
first evaluate UML 2.0 and UML-based tools' advantages and shortcomings,
and then find out how these technologies can streamline and automate parts
of the development process. The experts agreed that modeling is an
important tool for software security and regulatory compliance, with Soley
remarking that the ability to define business processes with UML enables
compliance, while Kobryn said MDD technologies can make the specification
of security and regulatory compliance services more precise. Among the
reasons the panel cited for development teams' adoption of UML tools were
the inability of traditional code-centered techniques to keep up with modern
software's complexity, growing team sizes, error rates, shrinking tolerance
for poor quality, compliance requirements, the cost of failure, long-term
maintenance, and the need for greater productivity.
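The model-driven approach the panel debates can be illustrated with a toy
sketch: a declarative model of a class is fed to a generator that emits
source code. The model format and generator below are invented for
illustration and are not any panelist's tool or UML itself.

    # Toy model-driven development: a declarative model drives code generation.
    model = {
        "name": "Order",
        "attributes": [("order_id", "int"), ("customer", "str"), ("total", "float")],
    }

    def generate_class(m):
        """Emit Python source for a simple data class described by the model."""
        lines = [f"class {m['name']}:"]
        params = ", ".join(f"{name}: {typ}" for name, typ in m["attributes"])
        lines.append(f"    def __init__(self, {params}):")
        for name, _ in m["attributes"]:
            lines.append(f"        self.{name} = {name}")
        return "\n".join(lines)

    print(generate_class(model))
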
Click Here to View Full Article
to the top
RFID Tags: Driving Toward 5 Cents
EDN (06/08/06) Vol. 51, No. 12, P. 69; Murray, Charles J.
Radio frequency identification (RFID) tags have not reached the nickel per
tag price point partly out of manufacturers' hesitancy, since lower-priced
tags may be less capable than higher-priced ones. "We've been talking
about the mythical 5-cent price point for years. Is it possible? Yes.
But it may not necessarily be the type of tag you're looking for," says
Venture Development's Mike Liard. Because there is little enthusiasm for
pursuing 5-cent tags, current tags are being employed in previously
undreamed-of applications while makers simultaneously improve RFID
technology and reduce costs by about 5 percent to 10 percent annually.
Experts expect RFID tags to be embedded in everyday items, while their
non-line-of-sight capability can thwart theft and forgery by facilitating
the gathering of location information without individual handling. There
is also confidence among experts that an "Internet of things," in which
nearly all conceivable items are networked together, will be facilitated by
RFID technology. This would allow the instant identification of all
products by anyone anywhere. Researchers expect everyday objects to
feature RFID via integration within the corrugate of cardboard boxes during
manufacture, instead of on sticky tags. MIT mechanical engineering
professor Sanjay Sarma believes RFID technologies will proliferate when
production volume hits a tipping point, reducing costs enough to encourage
RFID tagging of everyday objects. Sarma says, "The question now is the
tipping point. When do you get to the percentage that causes you to say,
'I'm going to put the tag inside the corrugate?' In the next year, we
could see it happen."
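The "Internet of things" scenario sketched above amounts to resolving a
scanned tag identifier against a shared registry. A minimal illustration,
assuming invented EPC-style identifiers and an in-memory registry:

    # Toy tag-to-item lookup: a reader scans a tag ID and identifies the item.
    registry = {
        "urn:epc:id:sgtin:0614141.107346.2017": {"item": "cardboard carton", "origin": "Plant A"},
        "urn:epc:id:sgtin:0614141.107346.2018": {"item": "pharmaceutical bottle", "origin": "Plant B"},
    }

    def identify(tag_id):
        """Resolve a scanned tag ID to an item description, if the registry knows it."""
        record = registry.get(tag_id)
        return record if record else {"item": "unknown tag", "origin": None}

    print(identify("urn:epc:id:sgtin:0614141.107346.2017"))
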
Click Here to View Full Article
to the top
Xerox Looks Into Role of Images on Decisions
Rochester Democrat & Chronicle (NY) (06/20/06) Rand, Ben
Xerox and several other companies believe imaging technology will one day
be everywhere, much as many in the high-tech industry envision for computer
technology. A number of ubiquitous imaging
projects are underway at Xerox, which says the technology is an extension
of its effort to develop document-related technology that boosts
productivity. "The whole idea is to make documents smarter, and images are
a part of it," says Siddhartha Dalal, vice president and manager of Xerox's
Imaging and Services Technology Center in Webster, N.Y. The company has
developed technology that would allow a Web page to automatically reformat
itself for the display device that wants to load it, such as in full
graphics for a personal computer screen or links for a cell phone or PDA.
Another project involves allowing images to provide information on content
and layout to printers so that they can configure settings to provide the
best image and the most detail. The goal is to take advantage of the
information from images and use it in interactions and decision-making.
The challenge of processing such data in digital form is more of a
financial nature than technological, says Charles Bouman, professor of
electrical engineering at Purdue University.
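The article does not describe how Xerox's reformatting technology works
internally; as a hedged illustration of the general idea, a server could
pick a page layout based on the requesting device, returning a links-only
view to a phone or PDA and a full layout to a PC. A toy sketch with an
invented user-agent check:

    # Toy device-aware reformatting: choose a layout from the user agent string.
    def render_page(user_agent, headline, links):
        ua = user_agent.lower()
        if "mobile" in ua or "pda" in ua or "phone" in ua:
            # Small screen: return a links-only view.
            return "\n".join(f"- {text}" for text, _ in links)
        # Desktop: return a fuller layout with headline and link targets.
        body = "\n".join(f"{text}: {url}" for text, url in links)
        return f"== {headline} ==\n{body}"

    links = [("Research", "http://example.com/research"), ("Products", "http://example.com/products")]
    print(render_page("ExamplePhone/1.0 (Mobile)", "Welcome", links))
    print(render_page("Mozilla/5.0 (Windows)", "Welcome", links))
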
Click Here to View Full Article
to the top
Advancing Scholarship & Intellectual Productivity
Educause Review (06/06) Vol. 41, No. 3, P. 44; Hawkins, Brian L.
In a recent interview, Clifford Lynch, executive director of the Coalition
for Networked Information (CNI), discussed his thoughts on emerging
technologies and collaborations that are reshaping the information
landscape. CNI has partnered with the U.K. Joint Information Systems
Committee (JISC) and, more recently, the SURF foundation in the
Netherlands. Lynch touts the prospects of recent large-scale digitization
projects, though he cautions that it is not enough simply to make digitized
materials available, but that new systems must transform the way that
information is presented so that people can actually interact with their
cultural heritage. Lynch broadly defines institutional repositories as any
service at the institutional level that oversees intellectual works,
including the results of e-science and e-scholarship, and makes them
available to the community. Initiatives to create and implement
institutional repositories vary widely among different nations, Lynch said,
noting that countries have different visions of how centralized and
integrated their institutional repositories should be. Lynch is a vehement
opponent of digital rights management, arguing that the name itself is
misleading and that the practice can actually violate people's legal
rights. Lynch also distinguishes between mass-digitization programs
and the Google Library Project. While Google's is one of the largest
projects in the world seeking to digitally preserve resources, and
certainly the best-known, it is scanning copyrighted materials, unlike the
efforts of the Open Content Alliance, the Million Book Project, and others.
Digitization is not limited to printed resources, Lynch says, adding that
there are several initiatives currently underway to preserve video, sound,
and images. The ability to search these collections will be a determining
factor in how Lynch's vision of human interaction with resources
materializes.
Click Here to View Full Article
to the top
An Impending Massive 3-D Mashup (Part II)
GeoWorld (06/06) Vol. 19, No. 6, P. 28; Limp, Fred
An analysis of the various techniques used to create and deliver 3D
products and data reveals that the domain for such products does not reside
in any traditional markets or business/technology sectors, but rather in
the space between them. As a result, people who devote most of their
attention to "core" business initiatives or current technology orientations
will probably miss out, which is why a shift in U.S. higher education is
called for. Most geospatial workflows use "standard" photogrammetric
solutions for their primary input source, but as the level of detail
increases, photogrammetric techniques are replaced by CAD systems and
traditional surveying methods; LIDAR, aerial-mapping systems, high-density
surveys, terrestrial photogrammetry, and CAD are currently the main sources
of 3D data. 3D data extracted from LIDAR is the most rapidly expanding
type of 3D data, while the aerial-mapping process is undergoing significant
changes through innovations such as multi-push-broom sensors on aircraft.
With a push-broom solution, each line of the sensor constitutes an
individual image for processing, which requires the use of specialized
software in order to combine the onboard GPS and Inertial Measurement Unit
data. High-density surveys are becoming more prominent as a source of
extremely large-scale 3D data, and the systems to facilitate such surveys
are consistent in that they generate raw point clouds rather than structured
information.
Time-of-flight units are usually employed for larger buildings and city
elements, while triangulation systems are typically used for smaller
objects. Terrestrial photogrammetry is a "traditional" technique for
producing fine geographic details, requiring each surface or feature of a
structure to be visible in multiple images.
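Combining onboard GPS and IMU data with each scan line, as described above
for push-broom processing, boils down to applying the platform's pose to
points measured in the sensor frame. A deliberately simplified sketch,
assuming a flat local coordinate frame and only a heading (yaw) angle rather
than full 3D attitude:

    # Simplified georeferencing: rotate a sensor-frame point by the platform
    # heading (IMU) and translate by the platform position (GPS).
    import math

    def georeference(point_sensor, gps_position, heading_deg):
        """Return the world-frame (east, north) position of a sensor-frame (x, y) point."""
        x, y = point_sensor
        h = math.radians(heading_deg)
        x_rot = x * math.cos(h) - y * math.sin(h)
        y_rot = x * math.sin(h) + y * math.cos(h)
        east, north = gps_position
        return (east + x_rot, north + y_rot)

    # One pixel of one scan line, 3 m to the sensor's right, platform heading 90 degrees.
    print(georeference((3.0, 0.0), (500000.0, 4200000.0), 90.0))
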
Click Here to View Full Article
to the top
Could the Internet Fragment?
IEEE Spectrum (06/06) Vol. 43, No. 6, P. 20; Minkel, J.R.
The use of native-script and alternative domain names could conceivably
balkanize the Internet, according to one school of thought. Countries such
as China are pushing for the introduction of native script domains, but
there are questions about whether users of Roman alphabets would be able to
access such domains. "There shouldn't be any kind of local name that works
only in some places, from some ISPs," cautions Paul Vixie, one of the
architects of the domain name system. Alternative roots in particular have
some observers worried about a fragmented Internet, but former ICANN board
member Karl Auerbach indicates that there is little justification for these
fears. Any root system operator that failed to carry .com would lose
visibility, he says, adding that the substitution of another .com would
likely prompt a successful trademark infringement lawsuit. Syracuse
University's Milton Mueller scoffs at the idea that competing roots will
ever rise to the level of mainstream popularity, arguing that no matter how
popular they become, they will be little more than add-ons to the classic
top level domains. The challenge of coordinating several hundred roots
would be formidable, but not as hard as some Internet purists say it would
be, according to Mueller.
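The fragmentation worry can be made concrete with a toy example: two root
operators that map the same top-level domain to different servers would make
the same name resolve differently depending on which root a user's ISP
follows. In the sketch below, the ICANN entry for .com is real, while the
alternative-root entries are invented:

    # Toy illustration of competing DNS roots disagreeing about a TLD.
    icann_root = {"com": "a.gtld-servers.net", "org": "tld-servers.example-org.net"}
    alt_root   = {"com": "com.alternative-root.example", "biz2": "ns.biz2.example"}

    def resolve_tld(name, root):
        """Report which server a given root table would delegate the name's TLD to."""
        tld = name.rsplit(".", 1)[-1]
        server = root.get(tld)
        return f"{name} -> ask {server}" if server else f"{name} -> no such TLD in this root"

    for root_name, root in (("ICANN root", icann_root), ("alternative root", alt_root)):
        print(root_name + ":", resolve_tld("example.com", root))
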
Click Here to View Full Article
to the top
Debugging ZigBee Applications
Sensors (06/06) Vol. 23, No. 6, P. 16; Wheeler, Andy
The complexity of ZigBee wireless sensor networks lessens the
effectiveness of traditional debugging methods, but new tools are emerging
that can help. Greater numbers of sensors and greater distances between
them make the collection of information via standard techniques
increasingly cumbersome, which can give rise to inaccurate readings of
where a malfunction is taking place, or can cause new faults to crop up.
Most issues with a ZigBee HVAC system can be attributed to information
overload or the failure to obtain required information because of the size
of the system. Information overload can be minimized by network analyzers,
which come with traditional packet sniffer capabilities in addition to
support for multiple data sources and sophisticated packet activity
analysis tools. Replacing traditional in-circuit debug and serial printing
functions with analyzers calls for close links between the tools and a
vendor's hardware, and ZigBee nodes use MCUs and radios that feature direct
hardware support for network debugging. This allows for the creation of
tools that can resolve many problems associated with sniffer-based tools,
as well as the enablement of access to more traditional network debugging
methods; the debugging integration can be executed through the use of
ZigBee systems-on-chip. Among the challenges that are still unmet are the
provision of processor halt/step debugging functionality in a network, and
the creation of new techniques for the presentation and filtering of data
collected by debugging tools as network size expands.
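The information-overload problem described above is what analyzer-style
filtering addresses: reducing a large capture to just the traffic that
matters. A minimal sketch, with invented packet records and field names:

    # Toy packet filter: keep only the captured traffic relevant to a fault.
    capture = [
        {"src": 0x12A4, "dst": 0x0000, "type": "data",  "rssi": -71},
        {"src": 0x12A4, "dst": 0x0000, "type": "ack",   "rssi": -70},
        {"src": 0x3F01, "dst": 0x0000, "type": "route", "rssi": -88},
        {"src": 0x3F01, "dst": 0x12A4, "type": "data",  "rssi": -90},
    ]

    def filter_packets(packets, src=None, pkt_type=None, min_rssi=None):
        """Keep only packets matching the requested source, type, and signal strength."""
        out = []
        for p in packets:
            if src is not None and p["src"] != src:
                continue
            if pkt_type is not None and p["type"] != pkt_type:
                continue
            if min_rssi is not None and p["rssi"] < min_rssi:
                continue
            out.append(p)
        return out

    # Show only weak-signal traffic from node 0x3F01, a likely place to look for faults.
    print(filter_packets(capture, src=0x3F01, min_rssi=-95))
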
Click Here to View Full Article
to the top
Operating Systems on the Rise
Embedded Systems Design (06/06) Vol. 19, No. 6, P. 53; Turley, Jim
Embedded Systems Design's annual survey of embedded systems developers
finds that around 28 percent of all embedded systems currently under
development will have no operating system (OS), and this absence is
especially prominent among developers of consumer, automotive, and
industrial electronics; conversely, computer peripherals are most likely to
feature OSes. A lack of need was the top reason provided by respondents
for not including an OS, followed by the pressure an OS would put on the
system's processor and/or RAM, cost, and difficulty of use. According to
the poll, OSes are more likely to appear in products at larger companies
than at smaller companies, while more experienced developers tend to use an
OS in the current project. Of the respondents who do use an OS, 51 percent
employ a commercial, off-the-shelf system; 21 percent use a proprietary,
in-house, or internally developed OS; 16 percent use an open-source OS; and
11.8 percent use a commercial Linux distribution. From these findings, it
can be surmised that the popularity of commercial OSes is growing
dramatically, and that such OSes are taking the place of in-house OSes.
Developers who opted for a commercially available OS said the choice of OS
was most heavily influenced by the software staff, although the software
manager also ranked highly as a decision maker. Top-ranking criteria for
assessing OSes include real-time performance, processor support, software
tool availability, a lack of royalties, cost, memory footprint, simplicity,
and middleware availability. The survey indicates a precipitous drop over
the last year in the use of both commercial and noncommercial open-source OS
distributions, with poor performance and/or real-time capability, support
concerns, memory usage, legal ambiguity, the state of development
tools, and price cited as reasons for the decline. Over 36 percent of
respondents said they would use a different OS in future projects than the
one they currently use, while around 63 percent said they would keep using
the same OS.
Click Here to View Full Article
to the top
DARPA Honors Decker for Work in Computers
UDaily (University of Delaware) (06/20/06)
DARPA has recognized University of Delaware researcher Keith Decker for
his contributions to complex computer organizations, particularly in the
methods by which large numbers of machines coordinate to solve problems.
Praising his "foundational research in generalized coordination
technologies," DARPA said that Decker's "superior research efforts and
vision fostered the development of a new paradigm which enables loosely
coupled distributed autonomous systems to work effectively together,"
noting the special significance his research has had on the Department of
Defense. His research is aimed at helping military units make coordinated
decisions in the field, with a parallel application for civilian responders
in emergency situations. Coordination is difficult amid the chaos of the
battlefield, Decker said, adding that he is attempting to develop computer
systems that can monitor the status of a plan and propose alternatives, if
necessary. The complex system is also intended to inform military
commanders of the impact that various setbacks can have on individual units
and furnish them with alternatives, and has an automated decision-making
function that could be extremely helpful when a unit comes under fire.
DARPA will test the technology over the next three years, at which point it
could contract the system to a manufacturer for further improvements and
production. Coordinating decentralized activities to solve problems is one
of the central challenges for both humans and computers, particularly in an
environment where circumstances change so quickly, such as a combat zone.
In addition to the obvious implications for military activities, Decker
sees applications in coordinating civilian activities, such as the botched
response to Hurricane Katrina, where competition among agencies,
policies, and authorities impeded rescue and recovery operations.
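The plan-monitoring behavior described above, checking task status and
proposing alternatives when something fails, can be sketched in miniature.
The tasks and alternatives below are invented, and Decker's actual
coordination systems are far more sophisticated:

    # Toy plan monitor: flag failed tasks and suggest pre-registered alternatives.
    plan = [
        {"task": "secure supply route A", "status": "failed"},
        {"task": "establish relay point", "status": "done"},
        {"task": "deliver supplies", "status": "pending"},
    ]
    alternatives = {"secure supply route A": "secure supply route B"}

    def monitor(plan, alternatives):
        """Yield a suggested replacement for every failed task that has an alternative."""
        for step in plan:
            if step["status"] == "failed" and step["task"] in alternatives:
                yield (step["task"], alternatives[step["task"]])

    for failed, replacement in monitor(plan, alternatives):
        print(f"'{failed}' failed; proposed alternative: '{replacement}'")
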
Click Here to View Full Article
to the top
43rd Design Automation Conference Announces Second
Integrated Design Systems Workshop
Business Wire (06/19/06)
A workshop on integrated design systems will be offered at this year's
Design Automation Conference (DAC) in San Francisco. The workshop is
titled, "How Can We Solve the Challenges of Design System Integration?,"
and top design system managers and design system providers in the industry
will discuss how to develop effective integrated design systems. ACM's
Special Interest Group on Design Automation (ACM/SIGDA) is a sponsor of
DAC, which has slated the workshop for two sessions to address the current
state of integrated design systems, and to discuss persisting challenges
and potential solutions. Design automation professionals who integrate and
develop design systems will have an opportunity to participate in the
discussion and ask questions during a closing panel. "Last year's workshop
drew attention to the need to streamline integrated design systems in
today's industry and was a very successful event," says John Darringer, an
organizer of the workshop. "We are happy to offer this workshop again this
year and look forward to discussing integrated design systems and coming up
with solutions to meet these challenges." The workshop is scheduled for
July 24, 2006, the first day of DAC. For more information on DAC, or to
register, visit
http://www.dac.com/43rd/index.html
Click Here to View Full Article
- Web Link May Require Free Registration
to the top