5 That Almost Made the List of Greatest Software
Ever
InformationWeek (08/11/06) Babcock, Charles
In compiling a list of the 12 greatest software applications ever written,
there were some very strong candidates that just missed the cut, writes
Charles Babcock. In 1963, MIT student Ivan Sutherland submitted a
doctoral thesis laying the theoretical groundwork for Sketchpad, the
software that ultimately led to the graphical user interfaces of Windows
and the Macintosh. Sketchpad revolutionized the way people navigated the
computer screen by treating it as a navigable window rather than as the
sequential command lines that had characterized computer interfaces for the
previous 20 years. Sutherland was awarded the ACM Turing Award in 1988,
and while his work was an undeniable breakthrough, it was so far ahead of
its time that not all of its ideas could be implemented in the software of
the day. Another close runner-up was Smalltalk, a major breakthrough to be
sure, but one that has been eclipsed by Java, whose network-oriented
structure will have much greater staying power as the Internet age
progresses. Though it has clearly revolutionized the types of applications
available to users, the global positioning system (GPS) just missed the cut
because those applications are largely made possible by the availability of
data. Previous programs had achieved the same type of data interaction,
just with less spectacular results. Video games are
perhaps the best representation of the development of the graphical user
interface. While the ability to create immersive texture-rich simulations
on a two-dimensional screen is remarkable, it is difficult to select the
one game that is the best embodiment of the technology. Finally, the
VMware ESX Server just missed the cut, despite having facilitated the
revolution in virtualization that has reshaped the world of Intel and AMD
servers.
A Sentinel to Screen Phone Calls
Technology Review (08/14/06) Graham-Rowe, Duncan
Microsoft researchers have developed a technique for automatically
screening phone calls. The system, called V-Priorities, determines if the
caller is a friend, family member, co-worker, or stranger, and evaluates
the call's urgency by analyzing characteristics of the caller's voice. It
then decides whether to send the call through or transfer it to voice mail.
Though originally designed as part of a broader effort to ensure that
individuals do not miss important calls while in a meeting, the system
could help stem the growing tide of spam phone calls. Preliminary testing
found that the system can determine with 90 percent accuracy whether or
not calls were solicited. It accurately judged whether calls were
personal or for business 75 percent of the time, and it judged personal
closeness with 84 percent accuracy. While voice spam is still relatively
rare, it is likely to increase with the growing popularity of
voice-over-Internet protocol (VoIP). In addition to increasing the volume
of voice spam, greater VoIP usage could also compromise corporate network
security, as viruses and other malware could conceivably enter a network
through nothing more than an answered call. V-Priorities has three levels
of analysis.
One level studies the prosody of the caller's voice, examining its rhythm,
syllabic rate, pitch, and pause length. The second level of analysis scans
for target words that could give away the purpose of a call. Finally, the
system considers metadata, such as the length and time of a message. The
machine-learning algorithm that powers the prototype of the system was
trained by analyzing 207 voice messages that one individual received over
eight months. Though the prototype was developed by analyzing voice mail
messages, the final version would actually answer a call and ask the caller
to identify himself. It is based on the same challenge-response technique
that can screen for spam in email, though that technology has not been very
popular.
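
The article names V-Priorities' three levels of analysis but not how they
are combined. The sketch below is a purely illustrative, rule-based scorer
that folds hypothetical prosody, keyword, and metadata features into a
single urgency estimate; every feature name, weight, and threshold is an
assumption, and the actual prototype uses a trained machine-learning model
rather than fixed rules.

# Illustrative sketch only: a toy scorer for the three analysis levels
# described above (prosody, keyword spotting, metadata). All features,
# weights, and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class CallFeatures:
    syllable_rate: float   # syllables per second (prosody level)
    mean_pitch_hz: float   # average pitch (prosody level)
    pause_ratio: float     # fraction of the message that is silence
    transcript: str        # recognized words (keyword level)
    duration_sec: float    # message length (metadata level)
    hour_of_day: int       # time of the message (metadata level)

URGENT_WORDS = {"urgent", "asap", "emergency", "deadline", "call me back"}

def urgency_score(call: CallFeatures) -> float:
    """Return a 0-1 urgency estimate from hand-picked (hypothetical) cues."""
    score = 0.0
    # Prosody: fast speech, raised pitch, and few pauses suggest urgency.
    if call.syllable_rate > 5.0:
        score += 0.3
    if call.mean_pitch_hz > 200:
        score += 0.1
    if call.pause_ratio < 0.15:
        score += 0.1
    # Keywords: target words that could give away the purpose of a call.
    words = call.transcript.lower()
    if any(w in words for w in URGENT_WORDS):
        score += 0.3
    # Metadata: a short message during business hours is weighted up slightly.
    if call.duration_sec < 30 and 9 <= call.hour_of_day <= 17:
        score += 0.2
    return min(score, 1.0)

# A call scoring above a chosen threshold would be put through;
# anything below it would be sent to voice mail.
example = CallFeatures(6.1, 220.0, 0.10, "It's urgent, call me back", 22.0, 14)
print("put through" if urgency_score(example) > 0.5 else "voice mail")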
Italy Wins World Cup of Programming
PC Magazine (08/11/06) Del Conte, Natali
The Italian team Even .ctor placed first in the software design category
in the Microsoft Imagine Cup, a competition that challenges student
programmers to develop real-life technology applications. One hundred and
eighty-one students from 72 teams representing 42 countries participated in
the event held last Friday. Poland's Piotr Marek Mikulski took first prize
in the algorithm category; Australia's Andreas Tomek won the IT category;
and Team Atomnium from France took the top spot in the programming contest.
"These projects demonstrate the power of software to address real-world
problems, and I'm so impressed by the high levels of technical innovation
that these students achieved in their work," said Microsoft Chairman Bill
Gates. "This year's Imagine Cup participants all share a commitment to
improving people's lives that is very inspiring. They represent the next
generation of business and technology leaders, and their creativity and
passion are reasons for us all to be optimistic about the future." More
than 65,000 students entered the first round of the competition. Now in
its fourth year, the Imagine Cup adopted the theme "Imagine a world where
technology enables us to live healthier lives." Winners receive
$25,000 in cash and are honored at an awards ceremony in Delhi, India,
where the competition was held. Next year's competition will be held in
Seoul, South Korea.
Your Life as an Open Book
New York Times (08/12/06) P. B1; Zeller Jr., Tom
Privacy advocates and industry analysts say a clear position on the
confidentiality of users' online search behavior must be established, for
there
are currently no laws to restrict the exploitation of such data, which is a
highly desirable commodity for marketers, law enforcement agencies, and
academic researchers. "In many contexts, consumers already have the
expectation that information about their cultural consumption will not be
sold," notes University of California, Berkeley, research Chris Jay
Hoofnagle. "They understand that the library items that they check out,
the specific television shows that they watch, the videos that they rent
are protected information." AOL's inadvertent disclosure of hundreds of
thousands of users' Internet search queries last week is viewed by some
privacy proponents as a colossal blunder for the search industry comparable
to the Exxon Valdez oil spill. "This AOL breach is just a tiny drop in the
giant pool of information that these companies have collected," says
Electronic Frontier Foundation lawyer Kevin Bankston. "The sensitivity of
this data cannot be overemphasized." Legislative attempts to address the
problem have been waylaid by skirmishes between privacy advocates seeking
wide-ranging consumer data safeguards, and the financial sector, which
wants to evade burdensome legislation and override stricter state laws.
Meanwhile, Congress has been debating taking a cue from Europe and
requiring the telecom and Internet industries to retain consumer
communications records for a set period in case they are needed in law
enforcement inquiries.
It's Science, Jim, But Not as We Know It
The Age (08/14/06) Cook, Margaret
More positive role models of women in science in television and movies
could lead to more women pursuing technical careers, claims Margaret
Wertheim, an Australian science writer. In "Pythagoras' Trousers: God,
Physics, and the Gender Wars," the 1997 book that brought her international
acclaim, Wertheim argues that women have been underrepresented in science
ever since its emergence 2,500 years ago. "There is a deeply entrenched
view in our society that maths is a masculine activity and that women are
not innately inclined towards it," Wertheim says, adding that boys usually
get more attention in math and science classes than girls. Wertheim
believes that math and science careers would be more appealing to
university students if they paid more, and that the situation could become
critical as baby boomers begin to retire. She is particularly concerned
about the number of qualified math and science teachers. Wertheim
dismisses as "bunkum" President Bush's proposal to spend $100 billion on
missions to Mars as a way to stimulate interest in science and math among
kids. "Instead, let's spend it on 100,000 science and maths teachers for a
decade and pay them $100,000 each," she says. "If we're serious about
getting more children into these subjects, then we must provide them with
good teachers."
Computer Grid Aims to Predict Storm Surge
Computerworld (08/11/06) Thibodeau, Patrick
The Southeastern Universities Research Association (SURA) is gaining more
computing power for a computer grid that will be used for a number of
research activities, including storm modeling. The research universities
have obtained new servers that are expected to double the number of CPUs in
the heterogeneous environment on the grid to about 1,800, and increase the
computing power from about 3 teraflops (trillion floating-point operations
per second) to about 10 teraflops. SURA has spent the past two and a half
years
building the grid, and it counts the Coastal Ocean Observing and Prediction
Program among its research initiatives. SURA is working to develop
forecasting models that will allow scientists to accurately predict a storm
surge 72 hours in advance. Currently, scientists can produce accurate
forecasts only about 24 hours before a storm. "The real
challenge here is to be able to create a product far enough in advance of a
storm hitting the coast to actually take action," says Gary Crane, director
of IT initiatives for SURA. For example, the grid could help determine
when to lower New Orleans' Lake Pontchartrain flood gates. About 14 of
SURA's 62 members participate in the grid, which could gain more computing
resources for analyzing meteorological and oceanographic data as more
universities sign on.
From Snapshot to Cover Model in a Single Click
New Scientist (08/12/06) Biever, Celeste
ACM's recent SIGGRAPH conference in Boston featured a presentation on an
algorithm that is able to enhance the appearance of a human face in a
photograph within a few minutes. Researchers from Tel Aviv University in
Israel developed the "digital beautification" algorithm, which does not
make substantial changes to the appearance of a person. Instead, Tommer
Leyvand and colleague Yael Eisenthal have designed the algorithm to make
subtle alterations to the photograph of a face in order to make the person
appear more attractive. The researchers have created a set of rules on
attractiveness for a software program, based on how people rated the
appearance of faces in approximately 200 photographs. The software
analyzed the images for distances between facial features and for ratios
such as facial width between eye and mouth level, deriving the set of
rules, or "beauty function." A second program applies the algorithm to a
facial image and then analyzes the changes to determine whether the
alterations make the person appear more attractive.
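
A minimal sketch of the idea follows, assuming hypothetical facial
landmarks, ratios, and ratings; it is not the Tel Aviv group's code, only
an illustration of fitting a "beauty function" over facial ratios and then
taking a small, subtle step toward it.

# Illustrative sketch only; landmarks, ratios, ratings, and the weighting
# scheme are all placeholder assumptions.
import numpy as np

def ratios(landmarks: np.ndarray) -> np.ndarray:
    """Turn 2-D landmark points into scale-free distance ratios."""
    d = lambda a, b: np.linalg.norm(landmarks[a] - landmarks[b])
    eye_span   = d(0, 1)   # left eye to right eye
    eye_mouth  = d(0, 2)   # eye level to mouth level
    face_width = d(3, 4)   # cheek to cheek
    return np.array([eye_span / face_width, eye_mouth / face_width])

# Ratio vectors measured from rated photos (hypothetical numbers), with the
# attractiveness ratings people gave those faces.
rated_ratios = np.array([[0.42, 0.60], [0.45, 0.63], [0.40, 0.58]])
rated_scores = np.array([6.5, 8.9, 5.2])

# A crude "beauty function" target: the rating-weighted mean of the ratios.
target = (rated_scores @ rated_ratios) / rated_scores.sum()

def beautify(face_ratios: np.ndarray, step: float = 0.25) -> np.ndarray:
    """Nudge a face's ratios a small, subtle step toward the learned target."""
    return face_ratios + step * (target - face_ratios)

face = np.array([[30, 40], [70, 40], [30, 90], [10, 60], [90, 60]], float)
print("before:", ratios(face), "after:", beautify(ratios(face)))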
Do ICTs Improve Our Lives?
IST Results (08/11/06)
Drawing on an analysis of socioeconomic surveys, the IST project SOCQUIT
is attempting to determine whether information and communication
technologies (ICTs) actually promote social interaction and improve
people's lives, or if
they run counter to social networks. "Take the television as an example.
It draws people away from real-life contact with family and friends; could
it be the same with computers and the Internet?" asks Jeroen Heres, SOCQUIT
project coordinator. To give policy makers a better understanding of the
effects of ICTs, the project sought to determine whether they have an
effect on people's ability to find a job, whether they allow people to work
past retirement, and whether the digital revolution has reached migrants.
Conventional wisdom holds that with 90 percent of new jobs and
60 percent of existing ones requiring ICT skills, computers should greatly
improve people's ability to find a job. On the contrary, the study found
that social contact is the predominant influence on employment, and that
ICTs do not provide the employment benefits for older workers that many had
assumed they did. The SOCQUIT project also found that social inclusion has
a greater impact on employment prospects for older workers than training.
The SOCQUIT report argues that the greatest opportunity for ICTs to improve
personal well-being is through their effect on a person's social life,
though it warns that policy efforts in the short term would only widen the
digital divide, as the socially isolated would still be left behind, while
those with active social lives would enjoy strengthened skills.
IBM, Pace Partner to Boost Students' Cutting-Edge
Computer Skills
Journal News (NY) (08/10/06) Alterio, Julie Moran
IBM is taking a more hands-on approach to its partnership with Pace
University. The Armonk, N.Y.-based technology giant will assist the Ivan
G. Seidenberg School of Computer Science and Information Systems in its
effort to develop its curriculum for the study of the mainframe, Linux, and
other open source technology. IBM and Pace expect to better prepare the
school's tech graduates with the computer skills that employers demand of
young hires today. According to an IBM survey, 75 percent of CEOs around
the world say there is a gap in the skill level of their workforce. IBM
teamed up with Pace last year for a program in which IBM personnel teach
courses, mentor students, and provide career advice, and the company's
research and development laboratories in Poughkeepsie, Yorktown Heights,
and Hawthorne host students for field trips. "There's a very great need
for information technology professionals, and at the same time, fewer
students are choosing to study computer science," says Dean Susan Merritt.
"A program like this is terrific because it provides a lot of resources to
the students to be prepared for the very large need out there for computer
science and IT professionals."
A Fundamental Look at DNSSEC, Deployment, and DNS
Security Extensions
CircleID (08/10/06) Huston, Geoff
The integrity of Internet-based applications and services is prey to the
corruption of the Domain Name System's (DNS) operation, writes Geoff
Huston, chief scientist in the Internet area for Telstra. Instances where
bogus DNS data is being passed off as valid can be identified through
DNSSEC, which is not a public key infrastructure. DNSSEC specifies an
extension to the DNS via the definition of additional DNS Resource
Records that DNS clients can use to confirm the DNS response's authenticity
and data integrity, as well as authenticate the nonexistence of a domain or
resource type mentioned in a fraudulent response. DNSSEC lacks public key
certificates, revocation capability, and explicit identification of the
involved parties. DNSSEC's performance can be affected by a number of
issues, including the increasing average size of a DNS response message
because of the additional signature records tacked on to the response; the
complexity of DNSSEC implementation and the problems a DNSSEC-aware
resolver may have to contend with owing to expired keys or mundane zone
configuration errors; the potential for small queries that generate large
responses to be exploited as denial-of-service amplifiers; variable DNS
root zone key rollover; and the increasing size of the zone file due to
additional DNSSEC records. The DNS could theoretically support
applications related to key distribution mechanisms through the
introduction of signed data into the DNS. Possible candidates are public
key certificates in the DNS in the context of key distribution for IPSEC or
SSH. Meanwhile, the employment of such certificates in the DNS as a way to
supply additional data to help receiving domains spot certain forms of
email spoofing is another area of interest Huston cites. One of the
biggest obstacles facing DNSSEC deployment is the resistance to change that
is inherent in an extremely large system such as the Internet.
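
As an illustration of the additional record types described above, the
sketch below uses the third-party dnspython library (an assumption, not
something the article mentions) to fetch a zone's DNSKEY RRset together
with its RRSIG signature and verify that signature; the zone name and
resolver address are placeholders, and a real validator would follow the
DS/DNSKEY chain from the root rather than trusting the zone's own keys.

# Requires the dnspython package and its cryptography dependency.
import dns.dnssec
import dns.message
import dns.name
import dns.query
import dns.rdatatype

zone = dns.name.from_text("example.com.")   # placeholder signed zone
resolver_ip = "8.8.8.8"                     # placeholder DNSSEC-aware resolver

# Ask for the DNSKEY RRset with the DO (DNSSEC OK) bit set so that the
# accompanying RRSIG records are returned as well. A truncated UDP reply
# would require retrying the query over TCP.
query = dns.message.make_query(zone, dns.rdatatype.DNSKEY, want_dnssec=True)
response = dns.query.udp(query, resolver_ip, timeout=5)

answer = response.answer
if len(answer) != 2:
    raise RuntimeError("expected a DNSKEY RRset and its RRSIG in the answer")
if answer[0].rdtype == dns.rdatatype.DNSKEY:
    dnskey_rrset, rrsig_rrset = answer
else:
    rrsig_rrset, dnskey_rrset = answer

# Verify the signature over the DNSKEY RRset. Here the zone's own keys are
# (unsafely) used as the trust anchor purely for illustration; validation
# raises dns.dnssec.ValidationFailure if the signature does not check out.
dns.dnssec.validate(dnskey_rrset, rrsig_rrset, {zone: dnskey_rrset})
print("DNSKEY RRset signature for", zone, "verified")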
An Experimental Study of the Coloring Problem on Human
Subject Networks
Science (08/11/06) Vol. 313, No. 5788, P. 824; Kearns, Michael; Suri,
Siddharth; Montfort, Nick
The behavior and dynamics of naturally occurring networks can be
influenced by the networks' structural characteristics, according to
theory. But connections between behavior and structure are difficult to
glean empirically, because such analysis usually covers fixed networks.
Michael Kearns, Nick Montfort, and Siddharth Suri of the
University of Pennsylvania's Department of Computer and Information Science
ran an experimental study on six human subject networks attempting to solve
the graph or network coloring problem, which models settings in which it
is preferable for one's behavior to differ from that of one's neighbors.
The researchers determined that it was less difficult for
networks based on cyclical structures to solve the coloring problem than it
was for networks based on preferential attachment, while "small worlds"
networks could solve the problem with even less difficulty. Kearns,
Montfort, and Suri's research demonstrated that increasing the amount of
information provided significantly decreased the time it took for the
cycle-based networks to solve the problem, and significantly increased it
for preferential attachment-based networks. Exit polls asking the subjects
what tactics they used showed they frequently and independently adopted
strategies that included selecting colors that result in the least number
of local conflicts, and avoiding conflicts with neighbors with high
connectivity. "With further study, such findings may have implications for
areas such as information sharing across large organizations and the design
of user interfaces for complex systems for multiparty coordination," the
researchers concluded.
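
The paper studies people making local, asynchronous color choices; the toy
simulation below is not the authors' experimental platform, only a sketch
of one strategy the exit polls mention (picking the color with the fewest
local conflicts, breaking ties at random), run on a small cycle that stands
in for the larger cyclical structures the subjects faced.

# Illustrative simulation of greedy, locally informed coloring moves.
import random

def distributed_coloring(adjacency, num_colors, max_moves=1000, seed=0):
    rng = random.Random(seed)
    nodes = list(adjacency)
    colors = {v: rng.randrange(num_colors) for v in nodes}
    for move in range(1, max_moves + 1):
        v = rng.choice(nodes)  # one "subject" acts at a time, asynchronously
        neighbor_colors = [colors[u] for u in adjacency[v]]
        # Choose the color with the fewest conflicts among the neighbors
        # this node can see; ties are broken at random.
        colors[v] = min(range(num_colors),
                        key=lambda c: (neighbor_colors.count(c), rng.random()))
        conflicts = sum(colors[a] == colors[b]
                        for a in adjacency for b in adjacency[a]) // 2
        if conflicts == 0:
            return colors, move
    return colors, max_moves

# A six-node cycle as a small stand-in for a cyclical network.
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
coloring, moves = distributed_coloring(cycle6, num_colors=2)
solved = all(coloring[a] != coloring[b] for a in cycle6 for b in cycle6[a])
print(f"solved={solved} after {moves} moves: {coloring}")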
Write as We Fight
Military Information Technology (08/14/06) Vol. 10, No. 7; Payton, Sue C.;
Herz, J.C.; Lucas, Mark
The Defense Department's software is rooted in proprietary systems, a
situation that denies the military the responsiveness and agility it would
have if it followed an open acquisitions strategy, writes deputy under
secretary of defense for advanced systems and concepts Sue Payton.
"If the boots-on-the-ground community is urged to 'train as you fight,' the
technology community that supports warfighters must similarly be urged to
code as we fight--not as a set of scattered assembly lines, but as a
robust, responsive network," she argues. Tight budgets and declines in
U.S. science and engineering graduates are conspiring to spur the DoD into
making fuller use of resources, which cannot happen unless the closed
software development model is discarded. The collaborative nature of
private-sector open source software development is a model the department
should aspire to, given its rapid software modification, a sprawling
community of contributing programmers, and free distribution, according to
Payton. Other benefits of open source development include faster and less
expensive deployment, a bigger and more transparent technical talent base,
and relief from the overhead of network architecture upgrades.
Furthermore, the chances of catching bugs and code defects and making
software secure and reliable are greatly increased by the rigorous scrutiny
open source software is subjected to. "In cases where the military pays to
develop software for its own use, as opposed to licensing pre-existing
software developed commercially, DoD needs to assert its legally
established government rights to view, access and modify code, and leverage
it across the department," Payton insists. A two-year roadmap for
incorporating open-source software development within the DoD was outlined
in an April 2006 document.
Free to Be GPL 3?
eWeek (08/07/06) Vol. 23, No. 31, P. 18; Brooks, Jason
The release of a second draft of the GNU General Public License Version 3
touched off a firestorm of criticism from the open source community, mainly
centering on the accusation that Free Software Foundation (FSF)
director Richard Stallman is an impractical evangelist who subordinates
concerns about viable business models to the ideological defense of free
software as an inherent good. The charge is irrelevant, writes Jason
Brooks, as Stallman by his own repeated admission is an unflinching
advocate of free software as an end in itself. While many of the changes
to the license are clarifications of points covered in the last update in
1991, the proliferation of software patents and other issues which the FSF
feels the license needs to address have emerged since the last update. The
provisions in the update dealing with digital rights management are among
the more controversial. The issue centers on whether free
software that contains cryptographic signatures that restrict use can still
be considered free. Ultimately, the success of the GPLv3 will depend on
the quality of the software that developers and vendors release under it.
The FSF has the choice of moderating the language that has alienated
prominent members of the open source community, such as Linus Torvalds, who
has vowed not to move the Linux kernel to the GPL 3 in its current form,
or risking the loss of participation by the prominent free software
projects that made the GPL an important force to begin with.
I.T. Versus Terror
CIO (08/01/06) Vol. 19, No. 20, P. 34; Worthen, Ben
Data mining is the counterterrorism IT technology of choice for the U.S.
government and intelligence community, according to experts. "There is a
real fear of not going down this path, because if there is value you don't
want to be on the side that opposed [a data mining project]," notes former
deputy director of the Defense Advanced Research Projects Agency's
Information Awareness Office Robert Popp. The government has thus far
avoided viewing data mining in the context of IT value, preferring to call
the apprehension of terrorists all the validation the methodology needs,
according to former Homeland Security Department CIO Steve Cooper. Fred
Cate of Indiana University's Center for Applied Cybersecurity
Research maintains that "As far as the oversight process is concerned, it
is clear that [data mining to prevent terrorism] is a disaster." Data
mining experts argue that the government's antiterrorism IT strategy should
be rigorously analyzed in the same manner that corporate CIOs vet company
IT projects. Experts also recommend that the government avoid defining IT
projects--those involving data mining in particular--too broadly, citing
examples of systems such as CAPPS II and Secure Flight whose implementation
is repeatedly delayed and whose generation of false positives is
unacceptable. Still, there is a general consensus among data mining
experts that the technique can effectively fight terrorism, provided that
it is managed appropriately. Cate says, "There are some extraordinarily
smart people [working on data mining systems], and I would be hard pressed
to think that they are wasting their lives on something that doesn't
work...But one of the things [the Defense Department's Technology and
Privacy Advisory Committee] kept focusing on was that you have to be able
to show that it works within acceptable parameters."
Displays of a Different Stripe
IEEE Spectrum (08/06) Vol. 43, No. 8, P. 40; Pollack, Joel
A biomimetic design approach to electronic displays can conserve a great
deal of power by supplying no more data than the eye can perceive and the
brain can take in, writes Clairvoyante CEO Joel Pollack. Of the three
types of cone photoreceptors in the human retina, there are more red and
green cones than blue cones, yet most flat-panel displays distribute red,
green, and blue color elements, or subpixels, in equal ratios and configure
them in either a stripe or delta pattern. The role of the blue subpixels
in helping the eye resolve images is negligible, so there is a lot of
waste. One power-saving strategy is the use of the Bayer pattern, which
includes additional subpixels in the color green and connects the green
subpixels diagonally. Clairvoyante's PenTile Matrix avoids the Bayer
pattern's color imbalance by reducing the size of all the subpixels, making
the green subpixels smaller than the red and blue subpixels, rotating the
pattern 45 degrees, and adding a white or clear subpixel; the result is
improved efficiency, because the clear subpixel passes backlight that the
red, green, and blue color filters would otherwise absorb.
The four-subpixel arrangement is still inefficient, so Clairvoyante
employed software algorithms to render a pixel with an average of just two
subpixels. This allows luminance and color to be represented in multiple
red, green, blue, and white combinations.
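
To make the role of the clear subpixel concrete, here is a generic
RGB-to-RGBW conversion sketch; it illustrates why an unfiltered white
subpixel saves backlight power, but it is not Clairvoyante's proprietary
PenTile subpixel-rendering algorithm, and the function name and 8-bit
channel values are assumptions made for the example.

def rgb_to_rgbw(r: int, g: int, b: int) -> tuple[int, int, int, int]:
    """Move the shared (gray) component of an 8-bit RGB pixel to a white
    subpixel. The white subpixel has no color filter, so the luminance it
    carries is not absorbed the way filtered red, green, and blue light is,
    reducing the backlight power needed for the same perceived brightness."""
    w = min(r, g, b)           # achromatic part common to all three channels
    return r - w, g - w, b - w, w

# A light gray pixel: most of its energy can come from the white subpixel.
print(rgb_to_rgbw(200, 190, 180))   # -> (20, 10, 0, 180)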
An Impending Massive 3-D Mashup--Part III: Focus on
Future Development
GeoWorld (08/06) Vol. 19, No. 8, P. 28; Limp, Fred
The success of 3D geospatial applications relies heavily on
interoperability, and though many fields have developed strong "information
silos" with compatible data structures and standards, making those silos
interoperable in terms of nomenclature, structures, semantics, and
ontologies is key, according to Fred Limp with the University of Arkansas'
Center for Advanced Spatial Technologies. There are important standards,
specs, and data schemas that people dealing with 3D geospatial information
should keep in mind, including the Open Geospatial Consortium's CityGML spec,
the aecXML specs for information interoperability, ANSI's Spatial Data
Standard for Facilities Infrastructure and Environment (SDSFIE), the
LandXML format, and the LAS format approved by the American Society of
Photogrammetry and Remote Sensing. The creation of standard 3D data
storage structures is a step toward realizing the storage, manipulation,
and analysis of 3D data within a "standard" database, notes Limp. New
developments he deems worthy of consideration include the creation of
geographic exploration software such as Google Earth version 4 and Skyline
Software's SkylineGlobe.com product; products that aim to incorporate
realistic 3D vegetation and natural landscapes into 3D applications and
geographic explorers; and products that can visualize subterranean
processes. Limp presents a mandate for 3D geospatial mashup that stresses
the geospatial community's need to comprehend the various technologies in
order to maximize its effectiveness. Among his recommendations are
increased familiarity with levels of detail, faster product production,
avoidance of "feature-itis," an emphasis on Keep It Simple Stupid (KISS),
an awareness of the community's existing strengths, and the bolstering of
"our capacities in data acquisition, spatial analysis, spatial data mining
and creating information where there was originally only data."