
ACM TechNews sponsored by AutoChoice Advisor -- Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 650: Friday, May 28, 2004

  • "GAO: Fed Data Mining Extensive"
    Wired News (05/27/04); Zetter, Kim

    A May 27 General Accounting Office (GAO) report on federal data mining involving individuals found 199 active or planned projects across 52 agencies; all but 77 of those projects involved Americans' personal data. The lion's share were coordinated by the Defense Department, and their goals ranged from identifying criminals and terrorists to analyzing scientific and research data. The report listed improving agency performance or services as the most common reason for data mining, while the fewest projects were concerned with intelligence analysis and uncovering terrorist operations. Fifty-four projects were listed as using data provided by private firms such as credit card issuers, and 36 of those involved names, Social Security numbers, and other personally identifiable information. Released in tandem with the GAO's findings was a report from the Center for Democracy and Technology (CDT) and the Heritage Foundation offering a three-step approach to developing and employing data mining without endangering personal privacy. The first recommendation calls for preserving individuals' anonymity when commercial database producers and government agencies share information by routing it through an intermediary; the second is to embed authorization requirements into federal systems so that data is seen only by those who need to see it; and the third calls for audit logs built into the computing system to detect and monitor improper access to and misuse of data. "We can use technology to protect privacy as the government acquires more and more data and more data analytical capabilities," declared CDT executive director James Dempsey.
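
    As a rough illustration of the CDT/Heritage recommendations, the hypothetical Python sketch below (the roles, fields, and policy are invented for illustration, not taken from the report) combines all three ideas: a one-way pseudonym for sharing through an intermediary, embedded authorization checks, and an audit log of every access attempt:

        import hashlib
        import logging
        from datetime import datetime, timezone

        # Recommendation 3: an append-only audit log records every access attempt.
        logging.basicConfig(filename="access_audit.log", level=logging.INFO)

        # Recommendation 2: a hypothetical authorization table of roles per field.
        AUTHORIZED_ROLES = {
            "ssn": {"fraud_investigator"},
            "name": {"fraud_investigator", "analyst"},
        }

        def pseudonymize(value: str) -> str:
            """Recommendation 1: the intermediary shares only a one-way identifier,
            so records can be linked across databases without exposing the person."""
            return hashlib.sha256(value.encode()).hexdigest()[:16]

        def read_field(user: str, role: str, record: dict, field: str):
            allowed = role in AUTHORIZED_ROLES.get(field, set())
            logging.info("%s user=%s role=%s field=%s allowed=%s",
                         datetime.now(timezone.utc).isoformat(), user, role, field, allowed)
            if not allowed:
                raise PermissionError(f"role '{role}' may not view '{field}'")
            return record[field]

        record = {"name": "Jane Doe", "ssn": "123-45-6789"}   # invented sample data
        print(read_field("agent7", "analyst", record, "name"))  # allowed, and logged
        # read_field("agent7", "analyst", record, "ssn") would be denied, and logged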
    Click Here to View Full Article

  • "Is IT Really a Man's World?"
    TechTarget (05/26/04); Hildebrand, Carol

    Sheila Greco Associates President Sheila Greco says the percentage of high-ranking female IT professionals in Fortune 1,000 companies has never exceeded 15 percent in the seven years her company has been tracking such figures, while Claudia Morrell of the University of Maryland Baltimore County's Center for Women and Information Technology reports that women are being discouraged from entering the technology field as early as high school. Yet Morrell adds that there are plenty of opportunities for technology-related careers despite mass layoffs in the tech sector and offshoring: The U.S. Department of Labor estimates that there should be 21.5 million high-tech workers by 2006, and that about 2 million additional computer specialists will join the workforce between 2000 and 2010. "The whole area of creation, development and engineering is really where the U.S. is strong, and the area where women need to be," notes Morrell, who points out that women in these roles face virtually no competition. The shortage of female IT workers has spurred a more coordinated effort by hiring authorities to recruit women, observes Greco, who suggests women cultivate both technology and business skills to hedge against potential setbacks in a technology career. Furthermore, the flexibility of computing-based careers is very appealing to women seeking to balance work and family life. Gloria Montana of the Anita Borg Institute for Women and Technology comments that "Out of all the math/science professions, anything that has to do with software lends itself to more flexibility because it opens up for the commuting situation." She also points out that IT jobs can teach skills that apply across a wide spectrum of potential careers.
    Click Here to View Full Article

  • "Gone Phishing: Web Scam Takes Dangerous Turn"
    Wall Street Journal (05/27/04) P. B1; Wagstaff, Jeremy

    Phishing scams are increasing not only in number but in deviousness: New phishing schemes employ not only convincing email addresses and fake Web sites, but also customized viruses that capture users' logins and passwords or take screenshots of login screens. Computer gangs mostly based in the former Soviet Union are sending out millions of phishing emails at an alarming rate, and the Anti-Phishing Working Group says there were 1,125 phishing incidents in April, 180 percent more than the month before; similarly, email security firm MessageLabs reports an 800-fold increase in phishing incidents over the last six months. The newest phishing campaigns appear to combine the expertise of virus writers, spammers, and socially intuitive con artists. Some banks have begun using pull-down menus for users to select password letters in an effort to defeat keystroke-logging programs, but new phishing attacks now capture screenshots to defeat that defense. Australia and New Zealand are the targets of many phishing attacks because the relatively small number of banks and banking customers there makes it easier to target victims correctly, and phishing groups operating overseas try to enlist locals to transfer money to their offshore accounts, offering those intermediaries a 5 percent cut. Often these recruitment efforts are themselves scams, with phishing groups posing as international companies and offering intermediaries an official-sounding "job," such as sales representative. Experts say phishing is still small-scale compared with other fraud activities, but it could undermine consumer trust in e-commerce and force companies to rethink their entire communication strategies.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

  • "WWW2004 Semantic Web Roundup"
    XML.com (05/26/04); Ford, Paul

    The Semantic Web is in need of working applications now that the technical foundations of RDF (Resource Description Framework) and OWL (Web Ontology Language) have been established. Both boosters and critics of the Semantic Web effort say applications need to get off the ground soon to determine the future of the technology. Tim Berners-Lee said as much in his WWW2004 keynote address, but presented the challenge positively to the gathered developers and researchers; he stated that the proof of the Semantic Web would not be just one application, but a working network of apps that is markedly different from previous connections. Many of the Semantic Web presentations at the recent WWW2004 conference used Hewlett-Packard's Jena framework for storing their RDF data, though Kowari from Tucana Technologies is an alternative. SIMILE (Semantic Interoperability of Metadata and Information in unLike Environments) provides an easy-to-use Web interface for RDF and is currently being used in several Semantic Web projects, such as one that allows inspection of W3C Technical Reports. Client-side Semantic Web programs include the Zakim IRC bot that complements teleconferences, and Bibster, which uses peer-to-peer links to manage and share bibliographic data. The University of Maryland described its SWOOP ontology browser, which improves dramatically on the interfaces of previous ontology browsers, while perhaps the most impressive Semantic Web application was Haystack, a project built on the Eclipse platform that manages collections of information, such as bookmarks and email messages, and appears to be almost infinitely extensible. Many of the applications were written in Java, but while that technology is proven in the enterprise, it has yet to find wide use on the desktop: Haystack, for example, is a bulky 40-MB download requiring 512 MB of memory to operate. Alternatives include .NET/Mono implementations and Mozilla's XUL language.
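
    For readers unfamiliar with what an RDF store such as Jena or Kowari actually holds, the sketch below shows the same triple-and-query idea in Python using the rdflib library (Jena itself is Java; rdflib serves here only as an illustrative stand-in, and the data is invented):

        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import FOAF

        # A tiny RDF graph of subject-predicate-object triples.
        g = Graph()
        ex = Namespace("http://example.org/")
        g.add((ex.timbl, FOAF.name, Literal("Tim Berners-Lee")))
        g.add((ex.timbl, ex.keynoteAt, ex.WWW2004))
        g.add((ex.WWW2004, ex.locatedIn, Literal("New York")))

        # SPARQL: who gave a keynote, and where was the conference held?
        q = """
        SELECT ?name ?city WHERE {
            ?person <http://xmlns.com/foaf/0.1/name> ?name .
            ?person <http://example.org/keynoteAt> ?conf .
            ?conf <http://example.org/locatedIn> ?city .
        }
        """
        for name, city in g.query(q):
            print(name, city)   # -> Tim Berners-Lee New York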
    Click Here to View Full Article

  • "Search Engines Try to Find Their Sound"
    CNet (05/27/04); Olsen, Stefanie

    Consumers are demanding more and more online multimedia content as broadband connections spread through households, spurring search engine companies to widen the scope of their information retrieval technologies to include video and audio, not just text. Nor are content providers sitting still: National Public Radio, for example, has been converting audio broadcasts into text files that have started to appear regularly on the index pages of Yahoo! News and Google News. The inability of mainstream search engines to retrieve such content, a consequence of their reliance on text-mining technology, has encouraged the development of specialty search engines from Singingfish, Virage, Hewlett-Packard, and others. Some engines mine online text for content related to keywords, others analyze metadata that describes a multimedia file's content, and still others transcribe segments of the audio or video and examine the language for context. A small number of engines aim to analyze audio and video features directly in order to search within files and uncover meaning and relevance. StreamSage labored for about three years on software that employs speech recognition technology for audio and video transcription and then analyzes context to comprehend the language and dissect the content's themes, although experts say the technology has an average accuracy rate of 80 percent. NPR uses StreamSage to transcribe audio content during broadcast, but NPR online director Maria Thomas notes that human transcription often replaces StreamSage transcription because of its greater accuracy. The radio network also tags its audio files with metadata using Singingfish technology, and a Boston affiliate employs HP's SpeechBot to facilitate audio-to-text transcription so that employees and visitors can conduct content searches on its Web site.
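
    The transcription-based approach amounts to indexing time-stamped text segments so a keyword query can jump to the right moment in a broadcast. A minimal sketch with invented data, assuming transcripts arrive as (offset, text) pairs:

        from collections import defaultdict

        # Hypothetical output of a speech-recognition pass over one broadcast:
        # (offset in seconds, transcribed text), roughly 80 percent accurate.
        segments = [
            (0.0,  "good morning this is the news"),
            (12.5, "searching audio and video on the web"),
            (31.0, "broadband is changing how we listen"),
        ]

        # Inverted index: word -> list of (program, offset) hits, so a keyword
        # query returns the program and the moment where the word was spoken.
        index = defaultdict(list)
        for offset, text in segments:
            for word in text.split():
                index[word].append(("morning_show", offset))

        print(index["broadband"])   # -> [('morning_show', 31.0)]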
    Click Here to View Full Article

  • "E-Vote Printers' High-Stakes Test"
    Wired News (05/27/04); Zetter, Kim

    Several U.S. states have enacted or are planning to enact legislation that requires electronic voting systems to provide paper ballots so voters can verify that their choices were registered accurately; Nevada will be the first state to deploy such a paper-trail system this year, using augmented Sequoia touch-screen machines in seven counties in the September primary and the November election. Steve George, a representative of Nevada's secretary of state, says the push to implement an audit trail is driven by voters' demands in the wake of recent computer-science reports that called the security of e-voting into question. Whether the VeriVote printers attached to Sequoia's AVC Edge systems will be certified in time for the election is debatable: Carson City Clerk-Recorder Alan Glover notes that the printer broke down during recent federal testing. Stanford University computer scientist David Dill recommends that Nevada subject the machines to rigorous testing, and says election officials should consider the use of the printers an experiment. "If the printers fail, it will be because of problems with the implementation [of a paper trail], not problems with the concept," he explains. Despite the deployment of VeriVote printers across Nevada, around 70 percent of the state's voting population will not use them because Clark County, which includes Las Vegas, owns older, unmodifiable Sequoia systems; the county will need to set up at least one new printer-equipped system in each polling place. Furthermore, the printed ballots would be prohibited from use in a recount under state law because they do not comply with current specifications. Paper-trail standards have yet to be devised by the federal Election Assistance Commission, but California is expected to issue standards that the commission may ultimately adopt.
    Click Here to View Full Article

    For information about ACM's activities regarding e-voting, visit http://www.acm.org/usacm.

  • "To Quiet a Whirring Computer, Fight Noise With Noise"
    New York Times (05/27/04) P. E8; Eisenberg, Anne

    Brigham Young University physicist Scott D. Sommerfeldt has developed an active noise reduction technique that masks the annoying hum of computer cooling systems using a ring of miniature speakers and microphones around the fan. The noise of the fan blades is detected by the microphones and other sensors, while digital signal processing algorithms produce opposing tones emitted by the speakers. Building a noise reduction system into the computer itself was a formidable challenge, because quieting the fan at the microphone can actually increase its noise level farther away. Sommerfeldt and his students placed microphones and speakers around a standard cooling fan in an aluminum casing modeled after a typical computer chassis in order to examine the sounds the fan made and model systems to suppress them. The noises Sommerfeldt elected to mask are those produced by the fan's blades as they spin and push air past obstructions. A digital signal processor receives motion data from the sensors and noise readings from the microphones, and the algorithms adjust the opposing sound waves the speakers radiate to suppress the basic tone of the blades, along with its associated overtones or harmonics. "We've taken the process a good step, but there are still improvements to be made," notes Sommerfeldt. The system effectively suppresses most of the fan's noise, but a faint hum lingers.
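
    The underlying technique, adaptively driving a speaker so that its output cancels a tone at an error microphone, can be sketched with the textbook LMS algorithm. This single-channel Python/NumPy toy is a simplification for illustration, not Sommerfeldt's multichannel system:

        import numpy as np

        fs = 8000                                   # sample rate (Hz)
        t = np.arange(fs * 2) / fs
        noise = np.sin(2 * np.pi * 120 * t)         # fan blade-passage tone
        reference = noise.copy()                    # reference sensor on the fan

        # Adaptive FIR filter: its output is the anti-noise the speaker emits.
        taps, mu = 32, 0.01
        w = np.zeros(taps)
        residual = np.zeros_like(noise)
        for n in range(taps, len(noise)):
            x = reference[n - taps:n][::-1]         # recent reference samples
            anti = w @ x                            # speaker output
            residual[n] = noise[n] - anti           # what the error mic still hears
            w += mu * residual[n] * x               # LMS: adapt toward cancellation

        print(f"residual power {np.mean(residual[taps:]**2):.4f} "
              f"vs noise power {np.mean(noise**2):.4f}")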
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "'Heads-Up' Display Lives Up to Its Name"
    EurekAlert (05/27/04)

    Students at the University of Washington led by Eric Seibel of UW's Human Interface Technology Laboratory have spent four years developing the Wearable Low Vision Aid (WLVA), a head-mounted device that helps people with poor vision circumvent stationary obstacles by beaming warning icons onto the retina. The low-cost prototype WLVA features a computer carried in a backpack and linked to an imaging and display system mounted on a pair of glasses. The imaging system comprises a camera and a cluster of two dozen light-emitting diodes; the diodes fire repeatedly while the camera captures infrared video from the wearer's field of view, and software contrasts the diode-illuminated scene with the ambient scene in real time. The system can distinguish closer objects from distant ones because more light reflects off the former, and it detects an imminent collision if closer objects remain in the field of view and grow larger. When a collision is detected, the WLVA signals the computer, which determines where and what the obstacle is and triggers a vibrating crystal that projects a raster image onto the back of the user's eye. "Because audio is already the key sense of detection for people with vision disabilities, we chose not to add any audible cues and only augment the user's impaired visual system with more easily seen laser light," notes Seibel. A smaller WLVA is planned, which will include a scanner with dramatically higher resolution derived from a micro-endoscope. The latest version of the device was shown at the annual Society for Information Display conference in Seattle on May 27.
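
    The detection logic described, nearer surfaces reflect more LED light, and an obstacle that stays in view while growing is a threat, can be sketched as a frame-differencing loop. A hypothetical Python/NumPy illustration, not the lab's actual code:

        import numpy as np

        def bright_area(lit, ambient, thresh=60):
            """Pixels far brighter with the IR LEDs on than off are nearby surfaces."""
            return int(np.count_nonzero(lit.astype(int) - ambient.astype(int) > thresh))

        def collision_imminent(frames, growth=1.3):
            """frames: successive (lit, ambient) image pairs from the head camera.
            Warn when the reflective region persists and keeps growing, i.e.
            something in the field of view is closing in."""
            areas = [bright_area(lit, amb) for lit, amb in frames]
            return all(b > a for a, b in zip(areas, areas[1:])) and \
                   areas[-1] > growth * areas[0]

        # Synthetic test: a bright blob that grows frame over frame triggers it.
        def frame(size):
            lit = np.zeros((64, 64), dtype=np.uint8); lit[:size, :size] = 200
            return lit, np.zeros((64, 64), dtype=np.uint8)

        print(collision_imminent([frame(8), frame(12), frame(20)]))   # -> True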
    Click Here to View Full Article

  • "Homeland Security's Missing Link"
    Business Week (05/24/04); Salkever, Alex

    The U.S. government is undertaking numerous projects that use cutting-edge technology to secure borders and prevent terrorism, but critics such as Council on Foreign Relations transportation security expert Steven Flynn argue that too much homeland security capital is being poured into operations in Iraq while not enough is devoted to implementing effective and economical security measures closer to home. James Lewis of the Center for Strategic & International Studies contends that technology has not made Americans safer, observing that "We have tons of gadgets and pilot projects, but we haven't tied them together for real intelligence." For example, the specifics of the $15 billion U.S.-VISIT program, whose purpose is to identify foreigners as they enter the country and track their whereabouts using a combination of computer networks and biometrics, remain vague: How various law enforcement agencies will share data, and how U.S.-VISIT will dovetail with notoriously uncommunicative government networks, are questions that remain unaddressed. Another concern of critics is that an overreliance on technology for analytical operations could undermine intelligence efforts. Expensive predictive data-mining projects designed to find likely terrorists by sifting through vast public and private databases have raised questions about their ethical appropriateness as well as their viability, with critics claiming that such systems will produce too many false alarms to be effective. Still, some efforts have started to pay off: Marking vehicles that regularly cross the border with radio-frequency identification tags has lightened the load for patrol agents. In addition, the FBI has upgraded its computer system to transmit images and audio files over its networks, and FBI agents are parsing leads with modern data-analysis methods.
    Click Here to View Full Article

  • "New E-Learning Tools"
    NSTP e-Media (05/27/04); Devi, Chandra

    Malaysian software provider Zeddel will develop new e-learning tools in collaboration with Canada's University of Alberta and University of Saskatchewan. The joint project will be led by the Alberta Ingenuity Center for Machine Learning (AICML), which concentrates on curiosity-driven machine learning research and on industrial applications in bioinformatics and interactive entertainment, and by Saskatchewan's Advanced Research in Intelligent Systems (Aries) Lab, which focuses on intelligent tutoring systems (ITS) and adaptive learning environment research. "Via this partnership, we intend to encourage the intellectual collaboration of researchers who share common interests, to promote collaboration leading towards the successful application for joint funding to granting agencies...and thirdly, to allow exchange of research products," declares Zeddel Chairman Datuk Dr. Vincent Lowe. The project will also enable the partners to test their research results in real-world applications with industry collaborators and to market them when feasible. University of Alberta computer science department chair Dr. Randy Goebel comments that Zeddel's enthusiasm for adopting leading-edge technologies and making them market-viable spurred AICML's decision to collaborate. Moreover, the center must interact with relevant industrial collaborators, regardless of their physical location, to sustain its scientific value by ensuring that its research is understood and extended to practical problems. Goebel states that Zeddel will need expert advice on how learning algorithms can be applied to ITS in order to carry out its ITS R&D strategy, and AICML and the Aries Lab will supply the scientific expertise to keep Zeddel's path clear regarding the use of machine learning in its R&D effort.
    Click Here to View Full Article

  • "Technology Could Help Drivers Make Right Choice"
    Los Angeles Times (05/25/04) P. B2; Buchanan, Joy

    Transit officials in Orange County, Calif., are looking at new technology that measures average vehicle speed and travel time in order to facilitate more effective traffic management and reduce the bottlenecks that slow commuters. One approach would determine a vehicle's speed and travel time by tracking its license plate image, captured by infrared sensors at several points along the freeway; encrypting the license plate numbers would assure drivers of their privacy, according to 91 Express Lanes general manager Ellen Burton. Microwave radar is another possible solution, albeit one that officials say could not calculate speed or travel time along the whole route as accurately. Meanwhile, the same technology used to collect tolls electronically could be employed to measure speed and travel time: Motorists would attach transponders to their vehicles that generate signals picked up by sensors along the toll lanes, and new software could read these signals as cars enter, traverse, and leave the toll lanes to determine their speed and travel time. This technique could also gauge speed and travel time in the general lanes, since people with transponders do not always take the toll lanes. Once the data is collected, regularly updated travel speeds and times could be posted on signs along the 91 lanes or made accessible to motorists via the Internet or phone. The radio-frequency technology could carry a hefty $1 million price tag, but Burton cites its reliability and interoperability with statewide toll systems and the transit authority's infrastructure as key selling points. The transit agency may also one day employ new technology to set tolls on a minute-by-minute basis according to traffic congestion.
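
    The transponder scheme reduces to matching timestamps: the same (encrypted) identifier seen at two sensors a known distance apart yields travel time and average speed. A hypothetical Python sketch, with a one-way hash standing in for whatever encryption the agency would actually use:

        import hashlib

        SENSOR_GAP_MILES = 4.0     # hypothetical distance between two sensors

        def anon(tag: str) -> str:
            # One-way hash: matched records never store the raw plate or tag ID.
            return hashlib.sha256(tag.encode()).hexdigest()

        seen_at_a = {anon("TAG123"): 9 * 3600}          # sensor A at 9:00:00
        seen_at_b = {anon("TAG123"): 9 * 3600 + 240}    # sensor B at 9:04:00

        for tag_hash, t_a in seen_at_a.items():
            t_b = seen_at_b.get(tag_hash)
            if t_b is not None:
                hours = (t_b - t_a) / 3600
                print(f"travel time {t_b - t_a} s, "
                      f"average speed {SENSOR_GAP_MILES / hours:.0f} mph")  # 60 mph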
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "The Increasing Importance of Process Skills"
    EarthWeb (05/26/04); Spafford, George

    As the pace of technological evolution quickens, IT process skills are becoming more important, writes George Spafford. The data center of 20 years ago used IBM CICS software for online computing, and systems programmers would have as many as 15 years' experience in that technology; today, two years' experience and a technical accreditation are enough to designate someone an expert. Similarly, the IBM 370 mainframe replaced the seminal IBM 360 mainframe after 10 years, while today servers see an update about every two years, with processor upgrades every few months. The speed of technical advance has left IT workers with far less time to hone critical process skills such as change management. New technicians today may have the requisite knowledge of technology but lack an understanding of how their work affects business operations; they might, for example, patch a critical system in the middle of the business day without realizing the implications of that system crashing. It could be argued that 20 years ago was the golden age of enterprise IT as far as process maturity: Enterprise data centers at that time used IBM mainframes running the MVS operating system and employed a select group of professionals to manage the centralized system. Roles were clearly defined, resulting in more secure and higher-quality systems. With the introduction of minicomputers, and later desktop PCs, deployed by business departments, the trend of increasing process maturity reversed; today, people with little knowledge of processes can set up a data center. IT workers rarely have time to fully develop important skills because they must constantly learn new technologies, and Spafford argues that putting such inexperienced and harried workers behind critical systems is like putting someone with no driving experience or knowledge of driving laws behind the wheel of a Ferrari.
    Click Here to View Full Article

  • "Will Code Check Tools Yield Worm-Proof Software?"
    CNet (05/26/04); Lemos, Robert

    A report from the Business Roundtable blames buggy and vulnerable software code for most of the major cyberattacks and network breaches that have harried American consumers and businesses in recent years, and says these exploitable code errors stem from software development processes that lack effective testing, review, and safety measures. Though software is tested for flaws, the usual purpose of testing is to see whether the software operates properly, not whether it fails when intentionally improper operations are performed. Static source code checkers, originally developed by academic researchers to glean data about software flaws, are being marketed by several companies as tools for spot-checking security. One such product was so well received by Microsoft that the computer giant bought Intrinsa, the company that sold it; the technology is now a key component of Microsoft's Trustworthy Computing initiative, and Microsoft security program manager Michael Howard reports that Intrinsa's tools are used regularly to enforce coding discipline among developers. Fortify Software founder Mike Armistead notes that a commonly held attitude among software developers is that some errors will always be missed, and that it is therefore acceptable to ship products and let others alert the developers to any flaws. But security researchers do not always disclose the flaws they detect, and many security experts think developers could be held accountable for the glitches they fail to find, particularly if checking technology is available--factors that are raising the stock of automatic code error detection tools. Some believe static source code checkers are not yet ready for commercialization: Immunity founder Dave Aitel perceives a need for such tools, but argues that current products generate too many false positives to be effective.
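
    A toy example of what a static source code checker does, written in Python with the standard ast module; commercial tools such as Intrinsa's are vastly more sophisticated, and this sketch merely flags calls to a few risky functions without ever running the code:

        import ast

        RISKY_CALLS = {"eval", "exec", "system"}   # illustrative blacklist only

        def check(source: str, filename: str = "<input>"):
            """Walk the parsed syntax tree and flag calls to dangerous functions,
            the simplest form of static security checking."""
            warnings = []
            for node in ast.walk(ast.parse(source, filename)):
                if isinstance(node, ast.Call):
                    name = getattr(node.func, "id", getattr(node.func, "attr", ""))
                    if name in RISKY_CALLS:
                        warnings.append(f"{filename}:{node.lineno}: call to {name}()")
            return warnings

        print(check("import os\nos.system(user_input)\n"))
        # -> ['<input>:2: call to system()']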
    Click Here to View Full Article

  • "Sensor Technology Comes in From the Cold"
    Innovations Report (05/27/04); Watts, Sarah

    The University of Southampton's GLACSWEB team has deployed a network of sensors within Norway's Briksdalsbreen glacier to monitor its behavior, which could yield insights into glacier dynamics and climate change. Southampton team member Dr. Kirk Martinez says the sensor probes wirelessly transmit readings of temperature, pressure, ice movement, and the basal sedimentary layer from about 60 meters below the glacier's surface to a base station, which in turn sends the data to the Southampton server by mobile phone; researchers can access the data at www.glacsweb.org. Martinez notes that the system must contend with the vagaries of weather, noise, and loss of power and communications. "Our sensors are housed in 'electronic pebbles' which will behave as part of the glacier, enabling us to get the clearest picture possible of what is happening deep below the surface," he says. Martinez predicts that sensor webs will proliferate across the globe and provide a more precise view of environmental change. "In order to make successful sensor webs issues such as: communications, low-power, robustness and adaptability have to be solved through a combination of mechanical engineering, electronics, computer science and environmental science," he says. The GLACSWEB project is a component of the Next Wave Technologies and Markets program funded by the Department of Trade and Industry, whose goal is to ensure that British industry can take advantage of new information and communications technologies and products that facilitate embedded intelligence in life-critical devices.
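
    A sensor web of this kind follows a simple relay pattern: probes take periodic readings, the base station batches them, and the batch is forwarded upstream. A hypothetical Python sketch of that data flow (the field names and values are invented, not GLACSWEB's actual protocol):

        import json
        import random
        import time
        from dataclasses import dataclass, asdict

        @dataclass
        class ProbeReading:
            probe_id: int
            temperature_c: float   # ice temperature
            pressure_kpa: float    # pressure from the overlying ice
            tilt_deg: float        # orientation change, a proxy for ice movement
            timestamp: float

        def read_probe(probe_id: int) -> ProbeReading:
            """Stand-in for the radio link to a probe 60 meters down in the ice."""
            return ProbeReading(probe_id, random.uniform(-2, 0),
                                random.uniform(500, 600), random.uniform(0, 5),
                                time.time())

        # Base station: poll each probe and batch the readings for the uplink
        # (the real system forwards to the Southampton server by mobile phone).
        batch = [asdict(read_probe(pid)) for pid in range(1, 4)]
        print(json.dumps({"station": "briksdalsbreen", "readings": batch}, indent=2))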
    Click Here to View Full Article

  • "Publishing by Design: Time to Make Human Factors a Concern"
    Online Journalism Review (05/20/04); Macdonald, Nico

    Nico Macdonald writes that insights into human-computer interaction (HCI) and design can solve many of the problems that currently limit the usability and appeal of digital mass communication, a medium shaped by several false assumptions about convergence and traditional media. The concept of media convergence into one homogeneous device applies not to the front end of Internet technology, as many think, but to the back end; Macdonald argues that people in fact need a heterogeneous front end. Experience with HCI shows that the PC/Web browser model is hardly the most convenient means of mass communication: Lack of portability, high costs, slow startup times, and unreliable broadband connections are just some of its disadvantages. Another fallacy posits that pre-digital communication models can be applied directly to the digital realm, which has led to online facsimiles of print and broadcast formats with confusing navigation and architecture. Macdonald notes that better models for mass communication can be built from existing technology, but an overemphasis on the technology itself has led to a paucity of easily understood, easy-to-use products. The author cites an article by Andrew Zolli in ACM's journal Interactions, which notes that "the very thing that makes traditional mass communications platforms (such as television, radio, newspapers and magazines) so successful is how completely they have solved their respective interface challenges." Another HCI tenet overlooked by the mass communication industries is the need for interaction design that satisfies individual users, and Macdonald advises new media publishers to reject the assumption that consumers will interrupt their busy schedules to absorb news at an online PC or interactive TV for hours at a time. There have been moves to address this issue with context-applicable technologies such as "printable versions" of news stories, handhelds used as reading devices, and text-messaged alerts.
    Click Here to View Full Article

  • "Xerox: Peek at Tomorrow"
    eWeek (05/24/04) Vol. 21, No. 21, P. 35; Solheim, Shelley

    A May event sponsored by Xerox's Innovation Group highlighted display breakthroughs that could be combined into transformative technologies for the workplace. Xerox fellow Beng Ong was on hand to present a plastic transistor jet-printed with semiconductive polymer ink, an innovation that could lead to lightweight and flexible electronic paper, very thin portable monitors and TV screens, and less expensive large-area devices and low-end microelectronics. Bryan Lubel, CEO of Xerox subsidiary Gyricon, said that Ong's transistor could complement Xerox's SmartPaper, a thin plastic material containing microscopic beads suspended in an oil medium. Each two-sided bead has an individually charged black hemisphere and white hemisphere, and electronically stimulating these particles--which can be done wirelessly and remotely--rotates them to form words and pictures that remain even after the voltage is cut off. SmartPaper-based signs are energy-efficient, battery-powered, and interoperable with Wi-Fi, and Lubel believes the integration of SmartPaper with Ong's polymer transistor will support the development of rollup electronic displays. Xerox Innovation Group President and CTO Herve Gallaire thinks these technologies will transform the office into a highly distributed, highly networked environment. He predicts, "People will be connected with devices we know today--but many more of them--and they will be connected a lot more with their PDA [or] their phone."
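
    SmartPaper's bistability, beads rotate while a field is applied and then hold their state with no power, can be mimicked in a few lines. A toy Python/NumPy model, with the physics grossly simplified for illustration:

        import numpy as np

        # A toy 'SmartPaper' sheet: each cell is one bichromal bead,
        # 0 = white hemisphere up, 1 = black hemisphere up.
        sheet = np.zeros((5, 9), dtype=int)

        def address(sheet, mask):
            """An applied field rotates only the addressed beads; unaddressed
            beads, and all beads once power is removed, keep their orientation."""
            sheet[mask] = 1 - sheet[mask]

        mask = np.zeros((5, 9), dtype=bool)
        mask[1:4, 2:7] = True          # 'write' a block of pixels
        address(sheet, mask)
        print(sheet)                   # the image persists with zero voltage applied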
    Click Here to View Full Article

  • "Going Mainstream"
    InformationWeek (05/24/04) No. 990, P. 34; Greenemeier, Larry

    Linux is spearheading open-source software's penetration of strategic business processes and systems: An InformationWeek Research poll of 420 business-technology professionals finds that almost 70 percent of respondents use the Linux operating system, compared with 56 percent a year ago; the software's most popular business applications include running Web or intranet servers, application development, database management, and email and message hosting. Demand for Linux is being fueled by the support of major vendors such as IBM, Dell, and Hewlett-Packard, which have invested in the operating system as part of their strategy to boost sales of Intel-based servers. Dirk Elmendorf of hosting vendor Rackspace Managed Hosting points out that the economic downturn has also benefited Linux, in that it forced people "to start thinking about how they were going to change their business to survive in a new environment." About 80 percent of the survey's Linux-using respondents cite Linux's low cost and insignificant licensing fees as key reasons for choosing the software, and Rebecca Wettemann of Nucleus Research says the operating system's cost-effectiveness derives mostly from its ability to further leverage existing equipment. Elmendorf contends that Linux's expansion into the enterprise sector stems from solid returns in smaller environments. However, 23 percent of survey respondents cite Linux's incompatibility with enterprise apps, a shortage of internal expertise, and a strong aversion to heterogeneous operating-system environments as reasons they do not plan to use Linux in the next 12 months, and 32 percent cite a lack of trust in open-source software, a situation that may have been exacerbated by the targeting of Linux users, licensors, and distributors in SCO Group lawsuits. Linux Professional Institute President Evan Leibovitch notes that the rollout of open-source applications for vertical industries is proceeding at a slow pace.
    Click Here to View Full Article

  • "Sparking the Fire of Invention"
    Technology Review (05/04) Vol. 107, No. 4, P. 32; Schwartz, Evan I.

    Microsoft Research founder and former Microsoft CTO Nathan Myhrvold contends that a focus on mission rather than invention afflicts both major corporations and small companies, and he and former Microsoft chief software architect Edward Jung have launched Invention Science as a venue where inventors can pursue cross-disciplinary research into information technology, biotechnology, and nanotechnology without being restricted by corporate rules. The past few years have seen startups committed to pure invention sprouting up, and corporations large and small joining the bandwagon, spurred by the belief that creative people draw sustenance from being allowed to pursue problems that interest them. Myhrvold thinks an explosion of inventing is possible thanks to several factors, including the increasing ease of knowledge sharing through technologies such as the Internet, the accelerating pace of technological progress, and unexpected but critical changes brought on by converging technologies. The more inventor-friendly atmosphere that has emerged recently is attributable to four trends, Myhrvold and others attest. First, individual, non-corporate inventors are enjoying a resurgence after decades of being cast as mere researchers while corporate labs dedicated themselves to shrinking product cycles. Second, venture capitalists have grown more discriminating in the wake of disappointing returns on unclear schemes, and are demanding that the companies they fund have solid, patented inventions that will protect their investments from rivals. Third, the Internet and other widespread communications tools are establishing new global connections and competition, pressuring employees to be more inventive and original thinkers. Finally, the inventive process is being reexamined and re-envisioned as a focused and deliberate cognitive effort rather than one primarily driven by "accidental" discoveries.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

  • "Software Safety By the Numbers"
    Embedded Systems Programming (05/04) Vol. 17, No. 5, P. 18; Payne, Jeff

    The International Electrotechnical Commission (IEC) has developed the IEC 61508 standard to define the functional safety of programmable electronic systems, which entails a systematic development process that stresses traceability, criticality inspection, and validation. When designing safety systems that conform to the standard, developers are expected to start with a risk analysis to ascertain the required Safety Integrity Level (SIL). Safety requirements in IEC 61508 fall into two categories: safety function requirements, which control the input/output sequences for safety-critical operations, and safety integrity requirements, which consist of failsafes to ensure the detection of failures and the system's switchover to a safe state should it be unable to carry out a safety function. A criticality analysis can be employed to establish the SIL based on the interaction between safety-critical, safety-related, and non-safety-critical elements; components are classified into one of four criticality levels--C3 (safety-critical), C2 (safety-related), C1 (interference-free), and C0 (not safety-related). This strategy allows development overhead and documentation for noncritical elements to be reduced. Software tools and their impact on system safety should not be disregarded, and vendors can take two approaches to determine whether their products are suitable for safety applications: certifying the product itself to the desired SIL (the favored approach), or basing the tool's SIL on the product's track record in production and its standards adherence. If preexisting code is used in a safety-critical project, it is necessary to confirm that the code is proven in use while also following an authorized IEC 61508 development and modification process. The product must be thoroughly tested at every developmental stage, with fault-injection testing a vital component. Upon completion of validation, developers must monitor and control system modifications, evaluate any changes for potential effects on safety requirements, and complete a formal impact analysis once the changes have been approved.
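
    The safety-integrity half of the standard, detect a failure and force the system to a safe state, follows a recognizable pattern. A hypothetical Python sketch (the valve scenario and thresholds are invented, and this is an illustration of the pattern, not a certifiable implementation):

        import time

        SAFE_STATE = "valve_closed"   # the state the system must fall back to

        def safety_function(sensor_value: float) -> str:
            """C3 safety function: derive the control output from a monitored input."""
            if not (0.0 <= sensor_value <= 100.0):   # plausibility check (failsafe)
                raise ValueError("sensor out of range")
            return "valve_open" if sensor_value < 80.0 else SAFE_STATE

        def run_cycle(read_sensor, deadline_s=0.05):
            """Failsafe wrapper: any fault, or a missed deadline, forces the safe state."""
            start = time.monotonic()
            try:
                state = safety_function(read_sensor())
            except Exception:
                return SAFE_STATE                        # detected failure -> safe state
            if time.monotonic() - start > deadline_s:    # watchdog-style timing check
                return SAFE_STATE
            return state

        print(run_cycle(lambda: 120.0))   # out-of-range sensor -> 'valve_closed'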
    Click Here to View Full Article


 