Businesses Push for High-Skilled Foreign Workers
Wall Street Journal (04/06/06) P. B1; Kronholz, June; Rogers, David
As the Senate debates a major immigration bill that could determine the
fate of millions of illegal immigrants and thousands of specialized foreign
workers, U.S. technology companies are looking on with interest as the
future of a critical mass of skilled labor hangs in the balance. With the
debate dominated by the contentious issue of illegal immigrants, the
industry is afraid that the bill will get scuttled and that the question of
visas will remain unresolved. The lobbying efforts of businesses have
virtually assured that an increase in the visa cap for workers of every
skill level will be included in any compromise. By August 2005, employers
had already claimed all 65,000 three-year visas for skilled workers for the
fiscal year beginning that October. With the 140,000 green cards issued
annually doled out evenly among sending countries, companies looking to
sponsor an employee from India or China can expect to wait five years
before the application is even read by the immigration service. Businesses
are particularly dissatisfied with the visa system's requirement that
foreign-born scientists have to leave the country upon finishing their
studies if a company is unable to secure them a visa. Worker shortages and
the uncertain future of foreign-born employees disrupt companies'
operations, often stalling or derailing key projects. The technology
industry persuaded Congress to triple the visa cap in 1999 to meet the
needs of the dot-com boom, though that provision expired in 2003, and two
years later Congress granted an exemption from the cap for 20,000
foreigners holding advanced degrees from U.S.
universities. Thanks to a re-emerging technology economy, those slots
filled quickly, and the industry is pushing for more. "It doesn't make
sense to educate this talent and send them to our global competition to
compete against us," said Intel's Patrick Duffy.
Computing Students to Test Math, Programming
Prowess
TechWeb (04/05/06) Sullivan, Laurie
The Association for Computing Machinery will hold the 30th annual World
Finals of the International Collegiate Programming Contest (ICPC) in San
Antonio, Texas, next week. Teams of students from U.S. schools such as
Binghamton University, the California Institute of Technology, Carnegie
Mellon University, MIT, Princeton University, University of Maryland, and
Washington University will compete in the three-day competition. In all,
249 students from around the world will be present. "It's quite inspiring
to see this much raw potential sitting in one room," says Douglas
Heintzman, director of strategy for IBM's Lotus division, who calls the
event "the battle of the brains." IBM is the sponsor of the competition,
which will have the teams of three mathematics, physics, and programming
students solve eight real-world mathematical problems, such as "finding the
optimal configuration for the distribution of cellular phone towers in a
metropolitan area, influenced by population density and obstructions such
as buildings and mountains," says Heintzman. The computing students,
who will also build logic for interactive software games, will have an
opportunity to win $10,000 scholarships, ThinkPad computers, flat-screen
monitors, and other prizes.
For more information about the ICPC, visit
http://icpc.baylor.edu/icpc
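For a sense of the flavor of such problems, here is a minimal greedy
sketch, in Python, of a toy tower-placement instance; the grid, the
population figures, and the Manhattan-distance coverage model are all
invented for illustration, and the actual finals problems are
considerably harder:

    # Toy tower placement: greedily pick sites that add the most newly
    # covered population; obstructed cells cannot be covered at all.
    from itertools import product

    GRID, RADIUS = 10, 2
    population = {(x, y): (x + y) % 5
                  for x, y in product(range(GRID), repeat=2)}
    blocked = {(4, 4), (4, 5), (5, 4)}      # buildings, mountains, etc.

    def covered(tower):
        tx, ty = tower
        return {c for c in population if c not in blocked
                and abs(c[0] - tx) + abs(c[1] - ty) <= RADIUS}

    def place_towers(k):
        reached, towers = set(), []
        for _ in range(k):
            best = max(population, key=lambda t: sum(
                population[c] for c in covered(t) - reached))
            towers.append(best)
            reached |= covered(best)
        return towers, sum(population[c] for c in reached)

    print(place_towers(3))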
Rep. Boehlert Urges Full U.S. Science Funding
EE Times (04/06/06) Chin, Spencer
U.S. Rep. Sherwood Boehlert (R-N.Y.) continues to serve as a champion for
increasing funding for research and development, math and science
education, and engineering programs. His latest call came recently before
the Appropriations Subcommittee on Science, State, Justice, and Commerce.
In testimony before the subcommittee, Boehlert said the U.S. government
must not let budgetary concerns stop it from fully funding technology
initiatives. Boehlert, chairman of the House Science Committee, stressed
that U.S. competitiveness in the years to come will depend on its
commitment to technology. And he threw his support behind the American
Competitiveness Initiative (ACI), a plan by President Bush to set aside
$5.9 billion in the fiscal 2007 budget, and $136 billion over the next 10
years for investment in R&D, education, entrepreneurship, and innovation.
The National Science Foundation, the Office of Science in the Energy
Department, and the National Institute of Standards and Technology would
benefit from ACI. "You have a unique opportunity this year to set the
nation on a path that will keep us competitive and prosperous in the decade
ahead," said Boehlert.
SIGGRAPH 2006 Announces Best of Show Award & Jury Honors
for the Computer Animation Festival
Business Wire (04/05/06)
The SIGGRAPH Computer Animation Festival will honor its latest Best of
Show Award and Jury Honors winners during SIGGRAPH 2006. The Computer
Animation Festival jury has announced "One Rat Short," by Alex Weil of the
United States, as the winner of the Best of Show Award; and "458nm" by Jan
Bitzer, Ilija Brunck, and Tom Weber from Filmakademie Baden-Wuerttemberg in
Germany, as the winners of the Special Jury Honors. The awards were
created to honor the exceptional use of computer-generated imagery and
animation, along with a well-told story. Regarding "One Rat Short,"
Terrence Masson, SIGGRAPH 2006 Computer Animation Festival chair from
Digital Fauxtography, says, "The film's emotional tone, cinematography, and
technical realization all melded wonderfully into a simple yet touching
short film," which follows the journey of a rat from its world to a
futuristic laboratory. "Intricate details and subtle animation build layer
upon layer of simple elegance," he says of the Jury Honors winner, a story
about the romance of two mechanical snails. ACM SIGGRAPH is the sponsor of
the 33rd International Conference and Exhibition on Computer Graphics and
Interactive Techniques, which is scheduled for July 30 through Aug. 3 in
Boston. Approximately 25,000 computer graphics and interactive technology
professionals from all over the world are expected to come together for
programs on research, science, art, animation, gaming, interactivity,
education, and the Web.
For more information about SIGGRAPH, or to register, visit
http://www.siggraph.org/s2006
To Packed Crowd, Speaker Discusses Cyber Security
Crisis
The Spectrum (04/07/06) Halleck, Tom
Speaking at the University at Buffalo, cyber-security expert Eugene
Spafford criticized the government and private industry for a haphazard
approach to combating cyber crime. "We have people committing (cyber
crime) offenses again and again, but it's been calculated as less than five
percent of these crimes are prosecuted," Spafford said. Often the victims
of cyber crime are large companies reluctant to disclose that their
security has been compromised, while law enforcement in the area of
computer crime is still in its infancy. A major U.S. Army command center
recently scrapped all of its computers because of pervasive security
problems. It invested in a new, $36 million system that was reportedly
compromised in three weeks, Spafford said. While serving on the
President's Information Technology Advisory Committee (PITAC), Spafford
realized that no one was adequately addressing the problem of cyber
security. "What is Congress doing? They're stopping research and
development spending. The amount the PITAC asked for was an estimated $100
million a year. The U.S. spends that much in three days in military
operations in Iraq." While the government's response to cyber crime has
been lackluster, Spafford takes heart in the growing interest in security
among academic researchers. He also notes that public awareness of the
problem is slowly beginning to spread, though people continue to respond to
unsolicited email asking for personal information.
Eugene Spafford is chair of ACM's U.S. Public Policy Committee;
http://www.acm.org/usacm
HPCS: The Big Picture
HPC Wire (04/07/06) Vol. 15, No. 14
In a recent interview, the Defense Department's Douglass Post outlined his
thoughts on DARPA's High Productivity Computer Systems (HPCS) program,
which has awarded funding to Cray, IBM, and Sun Microsystems to advance
supercomputing to the petaflop level while delivering systems that are
simple to use and program. The program aims to cut the time-to-solution
for both code development and production, while taming the complexity of
the computer's architecture so users can take advantage of increased
power. The
vendors are trying to improve on the traditional Linux cluster to create
hardware and software for better floating-point and integer-arithmetic
computations. Productivity is a key part of the project, Post says, and
the developers are keeping the flops/dollar metric in mind as they create
applications that address real problems. A dedicated productivity team is
conducting case studies to identify bottlenecks in the development process,
and the vendors are creating new languages and tools to express highly
abstract parallelism. The project is also working to consolidate the
languages being developed by the vendors to create a single language that
will be adopted by the community. This summer, DARPA will select one or
two vendors to receive funding to bring a multi-petaflop computer to market
by 2010. MPI is a workable language for supercomputing, Post says, though
there is a vast potential market for a version that can perform at a higher
level of abstraction. In developing new languages, the vendors are keeping
market acceptance a priority, and are working to ensure that they operate
on multiple platforms, while the Argonne National Laboratory is leading the
effort to consolidate the research into a single language.
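For contrast with the higher-level languages the vendors are designing,
here is what explicit MPI-style message passing looks like from Python
through the mpi4py bindings (a sketch that assumes mpi4py is installed;
the HPCS languages aim to hide exactly this kind of rank-and-reduce
bookkeeping):

    # Each MPI rank computes a partial sum; rank 0 collects the total.
    # Run with something like: mpiexec -n 4 python partial_sums.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    partial = sum(range(rank * 1000, (rank + 1) * 1000))
    total = comm.reduce(partial, op=MPI.SUM, root=0)
    if rank == 0:
        print("total:", total)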
The Fact Remains, U.S. Tech Leadership Must Be
Reinforced
Mercury News (04/07/06) Fuller, Douglas B.
While a recent study from Duke University contends that the education gap
in engineering and science between the United States and other countries is
overstated, the threat from China and other developing nations to U.S.
technology leadership is nevertheless quite serious, writes Douglas Fuller,
a postdoctoral fellow at the Stanford Project on Regions of Innovation and
Entrepreneurship. Even by the report's own figures, China awards 214,000
more science and engineering bachelor's degrees each year than the United
States, while the number of doctorates is increasing rapidly. Large
numbers of Chinese students are also earning their degrees in other
countries; in 2001, the number of Chinese students who earned their
doctorates in Japan, the United States, and the United Kingdom was
equivalent to 72 percent of the total number of doctorates earned by U.S.
citizens and permanent residents. China now produces 11 percent of the
world's science and engineering doctorates, while the United States' share
has fallen to 22 percent, roughly half of what it was in 1975. With
foreigners earning roughly half the computer science and engineering
doctorates in the United States, there is also the concern that U.S.
universities are training students who will return to their own countries
to work. As foreign countries continue to develop their own technology
industries, more workers will leave the United States to return home, just
as in the 1990s when as many as 100,000 Taiwanese scientists and engineers
left the United States to work in their native country's booming technology
sector. China and India appear to be following a similar pattern as their
governments are offering incentives to lure native-born workers back home.
China now ranks No. 4 in the world on the Georgia Institute of Technology's
Index of Technological Capability, having more than doubled its score in
the past 10 years due to increased patent activity and soaring government
spending on research and development.
HP Labs India Shows off Tech Goodies
CNet (04/06/06) Olsen, Stefanie
Researchers at HP Labs India recently showcased several of their
inventions that could spread computing in underserved developing nations
with vast potential markets. In India, language is a major barrier to
technology adoption, as less than 10 percent of the population can conduct
transactions or write in English, and just 50 million people are PC
literate, executives at Hewlett-Packard say. HP Labs India has developed a
special keyboard for India's 14 national languages, though at present the
Gesture Keyboard supports only Hindi and Kannada. The device enables users
to write with a pen, using handwriting-recognition software to digitize the
gestures users make and to distinguish base consonants from phonetic
modifiers. HP Labs believes that the Gesture Keyboard will
lower entry barriers to computing for millions of Indians. While just 15
million people in India have access to the Internet, 600 million have
access to television, so HP Labs developed Printcast, a technique for
transporting encoded files alongside television broadcasts: content shown
on TV is carried in an MPEG-2 file that viewers can unwrap and print. HP
Labs has also developed a pen-based device for filling out
forms electronically that records the motions of a pen and uploads the
information to Hewlett-Packard's backend software. In another project, HP
Labs is developing a digital library of educational material called
Educenter, based on open-source software developed by the DSpace project, a
digital-library initiative undertaken jointly by Hewlett-Packard and
MIT.
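To make the base-consonant/modifier distinction concrete, the toy Python
sketch below composes Devanagari consonants with vowel signs; the
handwriting-recognition step, which is the hard part, is omitted, and the
two-entry tables are a drastic simplification:

    # Compose a Devanagari base consonant with an optional vowel sign.
    BASE = {"ka": "\u0915", "ma": "\u092E"}    # Devanagari KA, MA
    MATRA = {"i": "\u093F", "e": "\u0947"}     # vowel signs I, E

    def compose(consonant, modifier=None):
        ch = BASE[consonant]
        return ch + MATRA[modifier] if modifier else ch

    print(compose("ka", "i"))   # base consonant plus modifier
    print(compose("ka"))        # bare base consonant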
Berners-Lee's Next Trick: Creating a More Useful
Web
Network World (04/05/06) Brown, Bob
The impact of the Semantic Web could be felt in the business world within
the next couple of years, Web inventor Tim Berners-Lee told attendees at
the MIT Information Technology Conference. The Semantic Web aims to
improve the sharing of data and package information so that it can be more
easily understood by computers. "It's motivated by the data out there
that's not on the Web," he said. Berners-Lee compared the relationship
between the Resource Description Framework (RDF), a key technology in the
Semantic Web, and data to that between HTML and documents. RDF is
supported by XML, uniform resource identifiers, and less celebrated
technologies such as SPARQL and OWL. Berners-Lee is finding the Semantic
Web a tough sell, just as he struggled to explain the value of the original
World Wide Web. Security is still a major issue for storing data on the
Semantic Web, though Berners-Lee feels that if he can achieve better
protection, then companies will realize a greater return on investment from
their data. Berners-Lee advised the attendees to begin exploiting RDF as
they model their data over the next couple years, and said that companies
should demand RDF data sharing from their partners by 2008. In 2009,
developers should begin creating new programs on top of the Semantic Web
framework, and by 2011, companies should be starting to discontinue the use
of some legacy programs, Berners-Lee said.
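For a small taste of what modeling data as RDF looks like in practice,
the sketch below builds two triples and queries them with SPARQL via the
rdflib package (one toolkit among several; the data is invented):

    # Every RDF statement is a subject/predicate/object triple.
    from rdflib import Graph, Literal, Namespace

    EX = Namespace("http://example.org/")
    g = Graph()
    g.add((EX.acme, EX.partnerOf, EX.globex))
    g.add((EX.acme, EX.revenue, Literal(1200000)))

    # SPARQL query: who are acme's partners?
    rows = g.query("""SELECT ?p WHERE {
        <http://example.org/acme> <http://example.org/partnerOf> ?p }""")
    for row in rows:
        print(row.p)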
A Pretty Good Way to Foil the NSA
Wired News (04/03/06) Singel, Ryan
Phil Zimmermann, author of the PGP email encryption program, has developed
an open-source software application to secure Internet phone calls. Zfone
is currently only available for OS X and Linux, though a version for
Windows is expected this month. The program encrypts and decrypts voice
calls as traffic moves in and out of the computer, and does not require
users to predetermine an encryption key or enter lengthy passwords. Zfone,
which has already been tested with X-lite, Free World Dialup, and the Gizmo
Project, is intended to be compatible with any VoIP client using the
standard industry SIP protocol. During the call, the software displays a
three-character code for each caller to read aloud to defend against
man-in-the-middle attacks, where eavesdroppers intercept the cryptographic
keys between two callers. If someone is attempting to intercept the
communications, the spoken codes will not match what appears on the
callers' screens, and they will know that someone is attempting to listen
in. Zfone is based on the ZRTP protocol, which adds a 3,000-bit key exchange
to the 256-bit AES cipher to generate the three-character codes that users
read aloud to each other. The protocol has been submitted to the IETF for
standardization. Zfone is intended principally to compete with Skype's
proprietary encryption system, which is not available for peer review and
whose alleged vulnerabilities therefore cannot be independently assessed.
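The spoken-code defense is easy to sketch. The Python toy below hashes
the shared secret of a Diffie-Hellman exchange down to a three-character
code; the real ZRTP derivation differs, and these parameters are far too
small for actual security:

    # Both callers derive the same short code from the shared secret; a
    # man in the middle, holding two different secrets, cannot make the
    # codes match on both screens.
    import hashlib, secrets

    P, G = 4294967291, 5                 # toy Diffie-Hellman parameters
    a = secrets.randbelow(P - 2) + 1     # caller A's secret
    b = secrets.randbelow(P - 2) + 1     # caller B's secret
    A, B = pow(G, a, P), pow(G, b, P)    # public values sent on the wire

    def spoken_code(shared):
        digest = hashlib.sha256(str(shared).encode()).hexdigest()
        alphabet = "23456789ABCDEFGHJKMNPQRSTUVWXYZ"
        return "".join(
            alphabet[int(digest[2*i:2*i+2], 16) % len(alphabet)]
            for i in range(3))

    assert pow(B, a, P) == pow(A, b, P)  # both ends agree on the key
    print(spoken_code(pow(B, a, P)))     # e.g. 'K7Q', read aloud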
Converting Light Wavelengths Within Fiber
Technology Review (04/05/06) Greene, Kate
Researchers have discovered a method of converting 1,550 nm light
wavelengths using fiber so that they are compatible with applications that
could benefit from sophisticated telecommunication devices, but that
operate at different wavelengths, such as electronic displays, biomedical
lasers, and air and sea communications. "People would like to be able to
generate, transmit, and detect electromagnetic radiation at different
wavelengths," said Lucent's Colin McKinstrie. "This experiment is exciting
because it shows that you can convert radiation efficiently between widely
different wavelengths." Led by Stojan Radic, a professor of electrical
engineering at the University of California, San Diego, the researchers
demonstrated that wavelengths of light between 1,541 nm and 1,560 nm could
be exploited to produce discernible green light with wavelengths ranging
from 515 nm to 585 nm. Usually, modulators are used to convert wavelengths
outside the optical fiber, though working within the fiber offers a more
reliable, faster, and cheaper conversion. To convert the signal, the
researchers used photonic crystal fiber and mixed wavelengths of 1,550 nm
and 800 nm to generate extremely intense light. The intensity forces
interaction among the light waves and the fiber, counter-intuitively
producing an amplified beam of 1,550 nm and a novel beam of 515 nm. While
traditional fiber can support conversion as well, the researchers achieved
the effect with just 20 meters of photonic crystal fiber, while it would
take miles of normal fiber. The researchers began the project with an eye
for improving submarine communications, though they are now considering its
application to the development of a surgical laser and new electronics
displays. A tunable laser that addresses every pixel in the display could
be paired with a universal band translator to synthesize any color in its
pure form, potentially tripling resolution, Radic said.
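The article does not spell out the mixing process, but as general
background, parametric conversion of this kind is constrained by
photon-energy conservation; in a four-wave interaction the inverse
wavelengths on the two sides of the exchange must balance:

    \frac{1}{\lambda_1} + \frac{1}{\lambda_2} =
        \frac{1}{\lambda_3} + \frac{1}{\lambda_4}

so reaching a widely separated band is a matter of choosing pump
wavelengths whose inverse-wavelength sum bridges the gap.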
Portland Project Ties Together Gnome, KDE
IDG News Service (04/04/06) Martens, China
The open-source Portland Project has developed software to integrate KDE
and Gnome, the two principal Linux desktop environments, which should
accelerate the spread of the operating system now that developers do not
have to decide between two incompatible interfaces when programming. The
Portland Project presented the software at the LinuxWorld conference
jointly with the Open Source Development Labs (OSDL) and freedesktop.org.
OSDL CEO Stuart Cohen said the software would be a much-needed stimulus for
the open-source operating system. The Portland Project, which grew out of
a meeting of OSDL desktop architects addressing interoperability issues,
has created a suite of command-line tools and a set of application
programming interfaces called DAPI. Independent software vendors
are receiving the protocols for testing, and the final release of Portland
1.0 is due in June.
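The practical appeal is that applications can target one desktop-neutral
layer instead of two desktops. The Python sketch below shells out to the
kind of command-line tool the project standardized; the tool name follows
the xdg-utils suite that grew out of Portland, so treat the specifics as
an assumption:

    # Ask whatever desktop is running -- KDE or Gnome -- to open a file.
    import subprocess

    def open_in_desktop(path_or_url):
        subprocess.run(["xdg-open", path_or_url], check=True)

    open_in_desktop("report.pdf")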
Exploring the Digital Universe
eLearn Magazine (03/06) Korman, Ken
Web content quality is rapidly becoming a critical issue as content swells
and the Internet evolves, and the open-source Digital Universe network of
Web portals is working to address the issue through universally accessible
content that is technically advanced, free from advertising, and validated
by experts. There are three separate bodies contained within the Digital
Universe: The nonprofit Digital Universe Foundation, which hosts content
and houses its global "stewardship" program; ManyOne, a transitory
for-profit firm that develops and supplies the Digital Universe's
underlying software and technical services; and the ManyOne Foundation,
another nonprofit that will own ManyOne once the project's investors are
remunerated. Digital Universe Foundation President Bernard Haisch said the
Digital Universe will feature an encyclopedia that the public can
contribute to under the guidance of stewards, who will make the final
decision over what content should be allowed on the live site. To
reimburse investors and maintain the flow of quality content, the Digital
Universe intends to generate income through a "venture philanthropy"
business model, in which the company teams up with "socially responsible"
organizations (educational institutions, museums, etc.) to sell branded
Internet services to their existing members and customers. In this way,
Internet users could access content from trusted and familiar sources
rather than big media companies, while these institutions could donate
content. Among the people behind Digital Universe is Wikipedia co-founder
Larry Sanger, who became disenchanted with and critical of the online
encyclopedia he helped launch as contributors became more and more
disdainful of subject-matter expertise, and Web entrepreneur Joseph
Firmage, who tried to create a widely accessible repository of scientific
knowledge modeled after the Encyclopedia Galactica, an effort that served
as a precursor of the Digital Universe. Firmage's initiative aligned well
with
Sanger's view that the best content is largely generated and rigorously
vetted by qualified experts, and also envisioned a bigger need to bring
depth and meaning back to the Web.
Open Source Code Fix Project Scores Early Success
Computer Business Review (04/04/06)
Open source developers have made significant improvements in identifying
and correcting vulnerabilities in top open source projects, according to an
analysis of source code published by Coverity. The average defect rate for
the 32 projects studied has fallen to 0.231 per thousand lines of code,
compared to the baseline of 0.434 in early March. In the LAMP (Linux,
Apache, MySQL, PHP/Perl/Python) stack, the Linux kernel has gone from 1,062
defects to 782, followed by Apache going from 32 to 24, PHP from 204 to 42,
Perl from 89 to 68, and Python from 96 to 14. And projects such as Amanda
backup and recovery, Samba file and print server, XMMS (X Multimedia
System), and OpenLDAP have seen their defects decrease to zero. Since the
initial report, a software defect has been fixed every six minutes,
according to Coverity. The company performed the analysis for the
Vulnerability Discovery and Remediation Open Source Hardening Project,
which is backed by the Department of Homeland Security.
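For readers unfamiliar with the metric, a rate like the 0.231 figure is
simply defects per thousand lines of code (KLOC). A minimal Python
illustration, using the article's kernel defect counts but an invented
line count:

    # Defect density = defects / KLOC; the kernel size is hypothetical.
    def defects_per_kloc(defects, lines_of_code):
        return defects / (lines_of_code / 1000)

    linux_loc = 6_000_000    # assumed kernel size, for illustration only
    print(defects_per_kloc(1062, linux_loc))   # before: ~0.177
    print(defects_per_kloc(782, linux_loc))    # after:  ~0.130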
Can Congress, Top Court Fix Patent System?
EE Times (04/03/06) No. 1417, P. 1; Merritt, Rick
The ailing U.S. patent system is in need of reform, but just how much
reform is a contentious matter; the Supreme Court, Congress, and the U.S.
Patent and Trademark Office are pursuing separate efforts to address the
issue this year, while large companies, small companies, and individuals
have differing opinions on what should be changed. Indeed, warring
philosophies in the tech community are likely to stymie any sweeping patent
system reforms for the time being. Legislation currently being drafted by
Sens. Orrin Hatch (R-Utah) and Patrick Leahy (D-Vt.) supports moving the
patent system from a first-to-invent to a first-to-file standard, a measure
that many inventors, small companies, and
drug firms oppose on the grounds that it would give large companies with
big patent departments an unfair advantage. The patent office, which plans
to hire 2,000 more examiners in 2005 and 2006 to address a massive
applications backlog, is calling for systemic changes to both the patent
granting process and the practice of defending patents. "I think we have
to be very careful not to undermine the patent system, just make sure the
people who try to 'game' the system don't get away with it--because we are
coming into a world where intellectual property is the most important kind
of property there is," says Deka Research and Development founder and
inventor Dean Kamen. The Supreme Court, meanwhile, will address the issue
of injunctions when it rules on the case of eBay v. MercExchange in June.
The plaintiff, MercExchange, alleged in 2001 that the online auction house
infringed on three of its patents; a district court denied an injunction,
a denial the Court of Appeals ruled in March 2005 was beyond the district
court's discretion.
Exceeding Human Limits
Nature (03/23/06) Vol. 440, No. 7083, P. 409; Muggleton, Stephen H.
Inexpensive data storage and increasingly efficient technologies have led
to the use of automated data collection and processing techniques
throughout the sciences as researchers deal with exponentially growing
bodies of data. As climate-modeling and astronomical experiments continue
to fill massive databases, scientists are becoming increasingly dependent
on computational power to identify and analyze connections between
different datasets, bringing capabilities previously thought possible only
in theory into real experiments. Scientists are using
machine-learning methods from computer science, such as neural nets and
genetic algorithms, to mine data and produce new hypotheses automatically
as they look to develop new drugs or mitigate the effects of climate
change. Researchers are often hamstrung by incompatibilities between
different models, though computer scientists are developing new formalisms
that coalesce mathematical logic and probability calculus to form a kind of
probabilistic logic. With experiments having already proven the ability of
robotic scientists to conduct tests to distinguish between opposing
hypotheses, a microfluidic robot scientist with active learning and
autonomous experimentation capabilities could appear within 10 years.
Laboratories on chips already exist, thanks to computer-directed
microfluidics, and a similar scaling process could be applied to
robot-scientist technology. One such application, a sort of programmable,
chemical Turing machine, could perform a wide range of chemical operations,
preparing and testing compounds automatically. The microfluidic Turing
machine could also serve as a model for cellular metabolism simulations, or
as the basis for an artificial cell that could be used in drug testing.
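The active-learning loop such a robot scientist runs is easy to
caricature in code: keep the hypotheses that survive each experiment, and
choose the next experiment where the survivors disagree most. In the
Python toy below, both the hypotheses and the simulated "lab" are
invented:

    # Discriminate between competing hypotheses with chosen experiments.
    hypotheses = {
        "linear":    lambda x: 2 * x,
        "quadratic": lambda x: x * x,
        "constant":  lambda x: 4,
    }

    def run_experiment(x):       # stand-in for the real apparatus
        return x * x             # nature happens to be quadratic

    candidates = dict(hypotheses)
    while len(candidates) > 1:
        # pick the input on which the survivors disagree the most
        x = max(range(1, 10),
                key=lambda v: len({h(v) for h in candidates.values()}))
        outcome = run_experiment(x)
        candidates = {n: h for n, h in candidates.items()
                      if h(x) == outcome}

    print("surviving hypothesis:", list(candidates))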
Beat Cybercrime, Switch to a Virtual Wallet
New Scientist (04/01/06) Vol. 190, No. 2545, P. 28; Biever, Celeste
To simplify the process of conducting online transactions, Microsoft is
promoting the concept of a virtual wallet, a collection of icons on various
Web sites that users can click to verify their age, billing information, or
other personal details without having to remember multiple user names and
passwords. The system should also improve security by eliminating easily
hacked passwords and subjecting common Internet transactions to the same
cryptographic protocols used in banking and government. "From a user
standpoint, it's really simple, it's fast, and it's much more secure," says
digital identity expert Drummond Reed. Microsoft intends to include the
required software in its next version of Windows, while the Eclipse
Foundation is developing a similar application for Apple and Linux systems.
The Internet was not built with the idea in mind that people would have to
verify each other's identity, and passwords have proven too easy for
hackers to crack. Microsoft's earlier attempt at a universal verification
scheme, Passport, failed amid concerns that the company would act as the
custodian for every consumer's identifying information. Credit card
companies and other third parties are responsible for guarding information
in Microsoft's new system, just as they are now. After a user registers,
the third party furnishes the Web sites with a digital certificate and the
user with a virtual card that enables him to obtain a digitally signed
certificate to prove his identity whenever necessary. Users access the
system, which creates public and private encryption keys, with a master
password that never leaves a secure section of the computer. The system
will not permit users to enter sensitive information on sites that it
suspects are spoofed. By asking the user's computer to decrypt a challenge
with its private key, the card issuer verifies the user, then creates a
digital certificate, signs it, and relays it back to the authenticated
site.
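A generic sketch of that challenge-response idea, in Python with the
cryptography package: the private key never leaves the user's machine,
and only a signed response travels. The wallet's actual message formats
are not described in the article, and this variant proves possession of
the key by signing rather than decrypting:

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa
    import os

    # Enrollment: the key pair is generated on the user's machine.
    private_key = rsa.generate_private_key(public_exponent=65537,
                                           key_size=2048)
    public_key = private_key.public_key()   # shared with the card issuer

    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)

    # A site issues a random challenge; the user's computer signs it.
    challenge = os.urandom(32)
    signature = private_key.sign(challenge, pss, hashes.SHA256())

    # The verifier raises InvalidSignature if the response was forged.
    public_key.verify(signature, challenge, pss, hashes.SHA256())
    print("challenge answered with a valid signature")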
Connecting the Dots
GeoWorld (03/06) Vol. 19, No. 3, P. 34; Pasierb, Timothy; Pehle, Todd
Government policy and decision makers are realizing that geospatial IT can
significantly help analyze data from various sources, and key to this
capability is interoperable geospatial processing and data sharing across
distributed, multi-vendor computer platforms enabled by specifications and
standards agreed upon by consensus standards groups. A service-oriented
architecture (SOA) can facilitate the upgrade and enhancement of a service
or service components without the need for a complete system overhaul by
creating an operating environment that makes the most of interoperability,
modularity, and scalability. An SOA's advantages include lower development
times and costs via standard, reusable elements, applications, and data;
closer ties between business requirements and IT infrastructure; fewer
integration costs and application development risks; the elimination of
redundant data and systems and consolidation of similar capabilities from
legacy systems through shared Web services; and a framework for composite
applications and an integrated enterprise. By coupling the SOA with
semantic Web technology, every layer of the architecture will be augmented,
and machines will be able to comprehend the meaning of Web services along
with the information stored and available to the enterprise, thus enabling
more efficient information discovery, automated information processing, and
automated "service chaining." The resource description framework plays a
fundamental role in semantic Web technology by specifying a metadata model
founded on making statements about resources as subject/predicate/object
expressions. Semantic technology makes conceptual queries possible, and
enriches the descriptions of enterprise-accessible Web services. A
semantically enhanced SOA will allow users to more easily identify the
correct information and unearth previously unknown information and
relationships, as well as offer a contextual, integrated methodology for
interacting with Web services and information.
Demystifying UML
Embedded Systems Design (03/06) Vol. 19, No. 3, P. 22; Mellor, Stephen J.
Unified Modeling Language (UML), standardized by the Object Management
Group (OMG), can describe systems graphically and textually; to embedded
systems developers, UML manifests itself as a notation for common
object-oriented concepts--as opposed to a system analysis and design
method--writes OMG participant and software development trailblazer
Stephen Mellor. UML
enables developers to function at a higher level of abstraction and become
more productive, as well as visualize concurrent behavior. UML comprises
13 diagram types, and the author lists the class diagram
(which shows classes, attributes, associations, and generalizations), the
state machine (which shows behavior over time in reaction to events), the
use case (which captures requirements according to system-user
interactions), and the sequence diagram (which shows synchronous and
asynchronous object interactions) as the most commonly used diagrams for
embedded systems. There are a variety of methods to integrate UML
diagrams, with no agreement on the optimal approach. Also, there are no
required links among diagrams prescribed by UML, nor any rule that there
has to be one state machine for every class. UML is most often used on an
informal basis (sketch), but is also employed to particularize software
structure (blueprint) and as executable models that incorporate action
language. Each of these usage methods facilitates distinct types of reuse:
Sketches chiefly visualize solutions and communication between people,
blueprints are good for design documentation, and executable models allow
separate reuse of the application and implementation. "Using UML doesn't
have to change your development process, but the introduction of UML is
often seen as an opportunity to make some changes," Mellor writes.
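As a concrete anchor for the state-machine diagram type, here is a
minimal Python sketch of behavior over time in reaction to events, for a
hypothetical embedded device; UML itself is only notation, so this is
just one possible realization of such a diagram:

    # A state machine: (state, event) pairs map to successor states.
    TRANSITIONS = {
        ("idle",       "power_on"): "warming_up",
        ("warming_up", "ready"):    "running",
        ("running",    "fault"):    "error",
        ("error",      "reset"):    "idle",
    }

    class Device:
        def __init__(self):
            self.state = "idle"

        def handle(self, event):
            # stay put if the diagram defines no transition
            self.state = TRANSITIONS.get((self.state, event), self.state)
            return self.state

    d = Device()
    for event in ["power_on", "ready", "fault", "reset"]:
        print(event, "->", d.handle(event))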