ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 686:  Wednesday, August 25, 2004

  • "Internet2: 2004 and Beyond"
    CNet (08/24/04); Reardon, Marguerite

    The Internet2 network's backbone, Abilene, facilitates collaboration and remote education among member universities and researchers through its high-bandwidth, low-latency infrastructure, and Internet2 researchers are working on a hybrid optical packet infrastructure (HOPI) to extend the network's reach even further. Internet2 supports over 227 universities, public schools, libraries, and research institutions, and links to upwards of 57 international high-capacity networks; Abilene was upgraded to 10 Gbps earlier this year, whereas links of 2.5 Gbps are typical of the current public Internet. Internet2 enables peer-to-peer file distribution, remote control of lab equipment, distributed computing, and high-definition videoconferencing, but commercial demand for such applications has thus far been paltry; the entry into the workforce of college graduates accustomed to Internet2-enabled applications is expected to spur demand in the corporate community. Internet2's director of backbone network infrastructure Steve Corbato says carriers must come up with a way to turn a profit from services sold over an Internet2-like network, adding that Internet2's financial model and that of the regular Internet are totally dissimilar. The HOPI project, a joint effort between Internet2 and National LambdaRail, aims to usher in next-generation superfast networking by integrating IP packet switching with dynamically provisioned optical light paths, also known as lambdas; it will employ wide-area lambdas served by IP routers and lambda switches capable of high-capacity, dynamic provisioning. Another project focuses on improving middleware to make collaboration over the Internet2 network more seamless and secure by promoting standardization and interoperability.
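
    The bandwidth gap is easier to appreciate worked out; a back-of-the-envelope sketch in Python (the one-terabyte data set is an illustrative figure, not one from the article):

      # Ideal transfer time for a 1 TB data set over the upgraded Abilene
      # backbone versus a typical 2.5 Gbps link, ignoring protocol overhead.
      DATA_SET_BITS = 1e12 * 8  # one terabyte, in bits

      for name, gbps in [("Abilene, 10 Gbps", 10.0), ("typical link, 2.5 Gbps", 2.5)]:
          seconds = DATA_SET_BITS / (gbps * 1e9)
          print(f"{name}: about {seconds / 60:.0f} minutes")
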
    Click Here to View Full Article

  • "Nation's Voting Machines Tested in Secret"
    Associated Press (08/23/04); Poovey, Bill; Werner, Erica; Konrad, Rachel

    The three companies charged with certifying voting machines in the United States--SysTest Labs, Wyle Laboratories, and CIBER--do not publicly disclose their testing methodology, nor do they reveal any vulnerabilities they run across, despite concerns over the security and reliability of e-voting systems that as many as 50 million Americans are expected to use on Nov. 2. This gives federal regulators practically no oversight over e-voting technology testing, which is paid for by the voting machine companies. Critics such as Stanford professor David Dill and Carnegie Mellon computer scientist Michael Shamos are outraged at the lack of election system transparency, which they argue is essential to the democratic process. Incidents of touchscreens malfunctioning during elections in U.S. states this year have raised doubts about the systems' security, while their lack of a paper trail is a particularly sore point should recounts become necessary. Some critics argue that the testing firms do not have the equipment to properly test the machines, but CIBER, Wyle, and SysTest employees and representatives are notoriously close-lipped and uncooperative about their certification procedures. The testers were selected over 10 years ago by the National Association of State Election Directors, under the authority of the Federal Election Commission. Former New York State elections director and chairman of the election directors' voting systems board Thomas Wilkey says the testers keep certification under wraps because the FEC has refused to take a lead role in the testers' selection, and because the government has refused to foot the bill.
    Click Here to View Full Article

    For information about ACM's activities regarding e-voting, visit http://www.acm.org/usacm.

  • "Concerns Mount Over Major Web Strike"
    eWeek (08/24/04); Morgenstern, David

    A rash of assaults on primary Internet servers and the recent defeat of the MD5 and SHA-0 hash algorithms are raising concerns among Internet operators that a convergence of political activism and hacking is taking place. Compounding these fears are warnings from security experts that terrorists may launch a long-threatened "electronic jihad" against servers sometime this week; in fact, Kaspersky Labs International founder Yevgeny Kaspersky expects an attack against financial and political sites on Aug. 26, according to a Tuesday report from RIA Novosti. Kaspersky's warning appears to imply that the e-jihad will take the form of wide-scale distributed denial-of-service attacks such as the ones that targeted Akamai Technologies in June and DoubleClick's domain name system in July, although experts hint that major Internet services as well as root servers are under threat. Meanwhile, Packet Clearing House research director Bill Woodcock suggests that Internet servers and ISPs could be threatened by the cracking of MD5 and SHA-0, which was detailed at the recent Crypto 2004 conference. The algorithms are employed in numerous commercial applications, including financial turnkey systems, enterprise content servers, and Internet routers. Woodcock likens the MD5 and SHA-0 breaks to tumbling dominoes: "A vulnerability is found, and a bunch of smart people follow the trail until bad things happen," he explains. The technique used to crack the algorithms may not yet be practical, but Woodcock notes that Internet operators worry that Internet services will be adversely affected if hackers adopt and refine the method.
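
    For context, MD5 and SHA-0 condense arbitrary input into a fixed-size digest, and the Crypto 2004 results show how to construct two different inputs that share one digest. A minimal illustration of digest computation with Python's standard hashlib (the messages are invented):

      import hashlib

      # Two different messages should, ideally, never share a digest; systems
      # that rely on digests to detect tampering break down once collisions
      # can be constructed, as the Crypto 2004 work demonstrates for MD5.
      for msg in (b"transfer $100 to Alice", b"transfer $100 to Mallory"):
          print(hashlib.md5(msg).hexdigest(), "<-", msg.decode())
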
    Click Here to View Full Article

  • "Fight Over Ultrawideband Standards Turns Into Tug of War"
    USA Today (08/25/04) P. 1B; Davidson, Paul

    The IEEE is divided into two camps when it comes to ultrawideband (UWB) standardization, with major vendors such as Intel, Microsoft, and Texas Instruments set against Motorola and about 60 startup firms. Analysts say the drawn-out fight has delayed UWB rollout by about one year, though the companies say the standards fight has not hindered product development. UWB is seen as an ideal technology for connecting home electronics because it is conservative with power, minimizes interference, and relies on unlicensed radio spectrum. The larger consortium, backing orthogonal frequency division multiplexing (OFDM), claims greater range, more control over interference, and flexibility to adapt to international regulations. Motorola is leading a smaller group that supports direct-sequencing (DS) UWB, which sends a continuous stream of data and uses less power than OFDM; DS supporters also have FCC approval and chipsets on the market. Each side has ulterior motives, and observers liken the fight to that between Betamax and VHS, with supporters having invested millions of dollars in a specific technology: That has caused unusual friction in the IEEE debates, with charges of vote-stacking and partiality on the part of the UWB panel chairman, whose company has received investment funds from Motorola. The UWB standards fight has caused some to question the voting process and call for a one-company, one-vote system similar to that used at the Consumer Electronics Association, which chooses standards for TVs, PCs, and other devices. Some worry that UWB could take as long as Wi-Fi did to become standardized--seven years--but too long a time frame would leave UWB's market vulnerable to competing technologies or de facto standardization efforts.
    Click Here to View Full Article

  • "Politicos Dig Deep for Your Data"
    Wired News (08/23/04); Gartner, John

    Political parties and activist organizations are working to sway voters by licensing and mining consumer data collected by marketing firms in order to better understand voter preferences and more narrowly target campaigns. Gun ownership, vehicle type, financial records, and magazine subscriptions are just a few of the categories organizations sift through and compare against voting records to find people who have not voted or are on the fence politically. MoveOn.org's Justin Ruben says the political action committee's "Leave No Voter Behind" program is using consumer data to build lists by ZIP code of people in swing states who either voted Republican or have not voted recently, identified by factors such as their race, marital status, and vehicle type; volunteers will then canvass neighborhoods and speak face-to-face with the targeted voters, reporting their progress through an Internet application. "They will ask people what the most important issues are, such as education or the war in Iraq, and then we provide them with fact sheets about the candidates' respective positions," notes Ruben. DePaul University marketing professor Bruce Newman says this election year is setting a new standard for market research spending by politicos, thanks to the greater funding made available by the McCain-Feingold campaign finance law. Obtaining consumer data is only part of the equation, according to Newman: Political organizations must also research the issues that will spur inactive voters to vote or cause them to support one candidate, and Newman suggests that activists determine the three most important issues that would influence the voting of a particular demographic, identifying them with the statistical models employed in market research. "There's a real gap between data collection and analysis in many of these organizations," Newman points out.
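
    A hypothetical sketch of the kind of list-building described above (every record, field name, and value is invented for illustration):

      # Invented consumer records joined with voting history; the filter pulls
      # swing-state residents who have not voted recently onto a walk list.
      consumers = [
          {"zip": "43215", "state": "OH", "voted_recently": False, "vehicle": "pickup"},
          {"zip": "43215", "state": "OH", "voted_recently": True,  "vehicle": "sedan"},
          {"zip": "33101", "state": "FL", "voted_recently": False, "vehicle": "minivan"},
      ]

      swing_states = {"OH", "FL", "PA"}  # illustrative list
      targets = [c for c in consumers
                 if c["state"] in swing_states and not c["voted_recently"]]
      print(len(targets), "households on the walk list")
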
    Click Here to View Full Article

  • "MEDai Wins Coveted KDD Cup for Predictive Modeling"
    PRNewswire (08/23/04)

    ACM's Special Interest Group on Knowledge Discovery and Data Mining (SIGKDD) presented MEDai with the prestigious KDD Cup 2004 on Sunday, Aug. 22, 2004, during the SIGKDD Conference in Seattle, Wash. The Orlando-based company took first place for the second year in a row, and has placed among the top four in each of the previous two years. MEDai's winning technology was the prediction engine of Risk Navigator Clinical, a tool that health care organizations can use to identify the high-risk members who drive cost and quality of care; the engine uses linear and non-linear models to assess a health care organization's member population. "Health plans, employers, and disease management organizations are realizing that predictive modeling is essential when addressing the pressures of managing their populations, providing quality health care and reducing costs," says MEDai CEO Steve Epstein. Hundreds of individuals and industry players from around the world participated in this year's KDD Cup competition, showcasing their work in predictive technology and data mining.
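
    MEDai has not disclosed its engine's internals; the sketch below merely illustrates how linear and non-linear models can be combined into one risk score, using scikit-learn on synthetic data (all features and labels are invented):

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 4))                 # synthetic member features
      y = (X[:, 0] + X[:, 1] ** 2 > 1).astype(int)  # synthetic "high-risk" label

      linear = LogisticRegression().fit(X, y)          # linear model
      nonlinear = RandomForestClassifier().fit(X, y)   # non-linear model

      # Average the two probability estimates to flag high-risk members.
      risk = (linear.predict_proba(X)[:, 1] + nonlinear.predict_proba(X)[:, 1]) / 2
      print("members flagged high-risk:", (risk > 0.5).sum())
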
    Click Here to View Full Article

  • "Japan Designers Shoot for Supercomputer on a Chip"
    CNet (08/24/04); Kanellos, Michael

    Japanese researchers have developed a computer chip that efficiently handles numerous small calculations performed on a limited data set. The MDGrape 3 chip currently performs at 230 gigaflops when running at 350 MHz, and is designed for a petaflop supercomputer that focuses on life sciences applications. Computing applications in this field require in-depth examination of a relatively small set of data; the MDGrape 3 chip performs about 100 times better than general-purpose chips when tackling such specialized tasks, says RIKEN researcher Makoto Tanji, who presented the technology at the Hot Chips conference. RIKEN, or Japan's Research Institute of Physical and Chemical Research, has been extending the MDGrape architecture for life sciences applications for the last several years, building on work previously done at the University of Tokyo. Eventually, the group aims to make Protein 3000 machines that are able to determine the exact characteristics of 3,000 protein molecules sometime around 2007. The University of Tokyo, meanwhile, is working on a more generalized supercomputing chip that can perform at a speed of one teraflop and is similar to a supercomputer-on-a-chip collaboration between IBM and the University of Texas. MDGrape 3 uses 20 calculation pipelines compared to just one or two in normal chips, and a broadcast memory architecture that pushes data to those pipelines in tandem; the chip is also optimized for parallelization and built using a 130-nm process. Tanji says the specialized design means MDGrape 3 provides one gigaflop of computing power for just $15, using just 0.1 watts of electricity, while a Pentium 4 costs $400 and uses 14 watts per gigaflop and the IBM Blue Gene/L costs $640 and uses 6 watts per gigaflop.
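
    Using the per-gigaflop figures quoted above, the closing comparison can be worked out directly:

      # Cost and power per gigaflop, as quoted in the article, scaled to a
      # 230-gigaflop budget (the throughput of one MDGrape 3 at 350 MHz).
      chips = {
          "MDGrape 3":       {"dollars_per_gflop": 15,  "watts_per_gflop": 0.1},
          "Pentium 4":       {"dollars_per_gflop": 400, "watts_per_gflop": 14},
          "IBM Blue Gene/L": {"dollars_per_gflop": 640, "watts_per_gflop": 6},
      }

      for name, c in chips.items():
          print(f"{name}: ${230 * c['dollars_per_gflop']:,} "
                f"and {230 * c['watts_per_gflop']:.0f} W for 230 gigaflops")
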
    Click Here to View Full Article

  • "Selective Shutdown Protects Nets"
    Technology Research News (09/01/04); Patch, Kimberly

    Max Planck Institute researcher Adilson Motter has demonstrated that cascade failures triggered by assaults on large, central network nodes could be mitigated by shutting down peripheral nodes. The scientist has built a model showing that the scale of a cascade failure can be dramatically lowered if a certain population of nodes that manage small loads are deactivated before the cascade effect starts, while the overall network load is kept in balance. Finding the right nodes to eliminate is the key challenge, as removing the wrong nodes can worsen the cascade effect. Nodes have the dual purpose of transmitting and generating load, but central nodes are targeted by attackers because they more often serve as transmitters and thus play a major role in load balancing. Motter's model illustrates that cascade failures produced by sudden load shifts can be diminished by the removal of load-generating nodes, as well as by the shutdown of heavily loaded connections that convey traffic from load-generating nodes to central distribution nodes. This scheme can be extended to power grids, which consist of generator stations that supply power, local stations that distribute power to customers, and transmission stations that carry power from generators to local stations; automatic devices along the transmission lines shut down grid components when their load becomes unmanageable, and Motter explains that transmission stations are most vulnerable to cascade effects. Intentionally disconnecting local stations from transmission stations that are about to fail can reduce the size of the cascade, according to Motter's model. "It is still speculative to talk about practical applications [but] I hope my work will motivate new studies on the control of cascading failures in realistic models of network systems," comments Motter.
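
    Motter's exact model is in his published work; the sketch below is a generic load-capacity cascade of the kind the article describes, using networkx, with the load measure and tolerance parameter chosen as illustrative assumptions:

      import networkx as nx

      # Node load is betweenness centrality; capacity is (1 + alpha) times the
      # initial load. Removing the most central node shifts load onto others,
      # and any node pushed past its capacity fails in turn.
      G = nx.barabasi_albert_graph(200, 2, seed=1)
      alpha = 0.2
      capacity = {n: (1 + alpha) * load
                  for n, load in nx.betweenness_centrality(G).items()}

      G.remove_node(max(capacity, key=capacity.get))  # attack the most central node

      overloaded = True
      while overloaded:  # shed overloaded nodes until the network stabilizes
          load = nx.betweenness_centrality(G)
          overloaded = [n for n, l in load.items() if l > capacity[n]]
          G.remove_nodes_from(overloaded)

      print(200 - G.number_of_nodes(), "of 200 nodes lost to the cascade")
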
    Click Here to View Full Article

  • "Putting People First in Tomorrow's Workplace"
    IST Results (08/24/04)

    Dealing with the sociological implications of future workspaces revolutionized by emergent information and communications technologies was the goal of the Information Society Technologies program's e-Locus project, which examined nine successful European initiatives in order to develop workspace design concepts and ideas for later research. Results included software solutions such as the Wings for Ships maritime weather information system, and methodologies like those suggested for a virtual environment workspace for the motor and aviation sectors. E-Locus coordinator Irene Lopez de Vallejo of Spain's Fundacion Tekniker says the project followed a multidisciplinary approach, employing designers, consultants, engineers, and computer scientists. A workspace's physical layout was an area of concentration for several projects: Vallejo notes, for example, that Italian office furniture makers in IDIA created a smart table equipped with a plasma screen for the purpose of exchanging information. "We noted a need for a wider approach in the design and construction stages," explains Vallejo. "That means more people from diverse backgrounds--including social scientists and architects, who are often left out of technology-driven projects." Another observation was the need for greater consideration of human factors in workspace design. Among the challenges that must be met is determining the look of future workspaces and who will work there, the kinds of technology required, society's reaction to spatially and temporally unbound workspaces, and how legal, ethical, and political issues are handled by policymakers.
    Click Here to View Full Article

  • "NASA Engineers Free Robonaut With Wheels, Leg"
    Space.com (08/23/04); Malik, Tariq

    The second iteration of NASA's robonaut series, Robonaut B, has been given greater mobility with the addition of a Segway-based robotic mobility platform and a "space leg" adapter that would enable the remote-controlled, wireless machine to move around in space by grappling the International Space Station (ISS). The adapter allows Robonaut B to plug into the ISS ports used by astronauts to restrain their feet, and the android has demonstrated its ability to use handrails to anchor itself to the station and move hand-over-hand, as well as employ tethers. The two-wheel mobility platform permits the robot to be driven over the Earth's surface, and robonaut project leader Robert Ambrose notes that a four-wheel or six-wheel platform is being considered for moon missions. Tests with the wheeled robonaut have shown that a single human operator could remotely control the machine's mobility and manipulation systems at the same time. Robonaut B boasts much smaller systems than its stationary predecessor, Robonaut A. "We basically scrubbed everything and replaced all of the electronics with much smaller miniature versions," says Ambrose. Robonaut B is programmed to override operator commands that could damage the robot, and can grasp certain objects without specific motion directions. The robonaut project is a joint initiative between NASA and the Defense Advanced Research Projects Agency.
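
    The command-override behavior is a common safety pattern; a hypothetical sketch (joint names and limits are invented, not Robonaut's actual values):

      # Safety filter of the kind described above: operator commands that
      # would exceed a joint's safe range are clamped before execution.
      JOINT_LIMITS = {"shoulder": (-90.0, 90.0), "elbow": (0.0, 135.0)}  # degrees

      def filter_command(joint, angle):
          lo, hi = JOINT_LIMITS[joint]
          safe = min(max(angle, lo), hi)
          if safe != angle:
              print(f"override: {joint} command {angle} clamped to {safe}")
          return safe

      filter_command("elbow", 170.0)  # the robot overrides an unsafe request
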
    Click Here to View Full Article

  • "Quantum Encryption Poised to Tighten Data Security"
    EE Times (08/24/04); Brown, Chappell

    Doubts about the viability of quantum encryption have begun to surface over concerns that the technology appears to be following the same path as other encryption technologies: An allegedly impenetrable data security method migrates into the commercial space, only to be cracked by an attack its creators never anticipated. National Institute of Standards and Technology computer security expert Richard Kuhn has published a method by which a hacker could exploit the quantum entanglement of photons to acquire the quantum key that decrypts the data and resend the data without being spotted, despite arguments that any attempt to observe quantum-encoded information is detectable. Several quantum encryption schemes are vulnerable to Kuhn's man-in-the-middle attack strategy, but a BB84-based optical encryption scheme devised by Austrian researchers uses single photons rather than highly attenuated photon sources, complicating qubit detection and defeating Kuhn's technique. Security experts such as Kuhn contend that an absolutely impenetrable encryption scheme is impossible to realize because new technological developments can render long-standing defenses obsolete. Nonquantum encryption currently follows two fundamental strategies: private keys are derived either from data streams produced by complex bit transformations or from computationally difficult problems. However, the rapid growth of computers' computational capability only hastens the obsolescence of any such scheme. Despite these concerns, a number of quantum encryption schemes are in development or are being rolled out by various firms.
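
    The BB84 protocol underlying such schemes can be sketched as a classical simulation (a toy model, not real quantum optics):

      import random

      # Alice encodes each bit in a randomly chosen basis; Bob measures in his
      # own random basis. Only positions where the two bases match are kept,
      # since a mismatched basis yields a random, useless result.
      random.seed(4)
      n = 16
      alice_bits  = [random.randint(0, 1) for _ in range(n)]
      alice_bases = [random.choice("+x") for _ in range(n)]
      bob_bases   = [random.choice("+x") for _ in range(n)]

      bob_bits = [bit if ab == bb else random.randint(0, 1)
                  for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
      sifted = [bit for bit, ab, bb in zip(bob_bits, alice_bases, bob_bases)
                if ab == bb]
      print("sifted key:", sifted)
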
    Click Here to View Full Article

  • "New Software Turns Family Albums and Home Movies Into Picasso Masterpieces"
    University of Bath (08/25/04)

    Dr. Peter Hall and Dr. John Collomosse from the University of Bath's Department of Computer Science have developed software that can take photographs and automatically transform them into cubist-style pictures reminiscent of Picasso. The software gives the computer an "aesthetic sense" so that it can identify facial features and other aesthetically important elements in photos or film footage: "When humans draw or paint they distill all the vast detail a camera sees into a few lines or daubs of paint," explains Hall. "We plug digital cameras into our computers and write software that 'looks' for the same kind of important things as humans do." Photos of a subject taken from multiple perspectives are fed into the computer, and the software automatically recognizes key areas and separates them into chunks, which are statistically rearranged, randomly chosen, and distorted into a cubist representation. The software is part of a suite of imaging technologies that has the potential to dramatically change animation by rendering photos, videos, and movies as drawings, cartoons, and paintings. The software can eliminate the flickering effect typical of hand-painted key-frame animation, and allows animators to create a range of animation effects, such as the insertion of streak lines behind moving objects in video, with a minimum of manual effort. Only professional animators participating in the research are currently allowed to use Hall and Collomosse's software, but software and animation companies have expressed interest. The Engineering and Physical Sciences Research Council is funding the research.
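
    Hall and Collomosse's software is not publicly available; the numpy sketch below only gestures at the chunk-and-rearrange step described above (a stand-in image and crude shuffling, far short of real cubist rendering):

      import numpy as np

      rng = np.random.default_rng(1)
      image = rng.integers(0, 256, size=(128, 128, 3), dtype=np.uint8)  # stand-in photo

      # Cut the image into 32x32 chunks, shuffle them, and reassemble: a crude
      # stand-in for the statistical rearrangement of salient regions.
      chunks = [image[r:r+32, c:c+32]
                for r in range(0, 128, 32) for c in range(0, 128, 32)]
      rng.shuffle(chunks)
      rows = [np.concatenate(chunks[i:i+4], axis=1) for i in range(0, 16, 4)]
      cubist = np.concatenate(rows, axis=0)
      print(cubist.shape)  # (128, 128, 3)
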
    Click Here to View Full Article

  • "Military Swarm Study 'At the Edge of Chaos'"
    Mail & Guardian Online (08/24/04); Sands, Neil

    The Australian government's Defense Science and Technology Organization is working on software that replicates insect swarm behavior in order to develop teams of small, inexpensive, and disposable drone vehicles that use collective intelligence to carry out battlefield operations in hostile areas. This would reduce the need to send soldiers into dangerous situations, notes mathematician Alex Ryan, who is leading the effort. "We would have thousands of these unmanned vehicles communicating with one another to carry out missions," he explains. Ryan says the key challenge is providing the drones with an overall goal to pursue while making them flexible enough to adapt to changing circumstances. He remarks that recreating natural swarm behavior, such as the mass-attack strategy killer bees employ, is not his team's objective; the researchers are attempting to refine such behavior to build an intelligent communications network instead. Ryan acknowledges that the vehicle swarms could be modified to carry ordnance, but their initial perceived application is surveillance. He says the algorithms being used to mimic swarm behavior straddle the edge between order and disorder. Ryan expects the project to be completed within 10 to 15 years.
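
    The team's algorithms are unpublished; the classic "boids" rules that swarm work of this kind builds on (cohesion, alignment, separation) can be sketched as follows, with all parameters invented:

      import numpy as np

      rng = np.random.default_rng(2)
      pos = rng.uniform(0, 100, size=(30, 2))  # 30 drones on a 2-D field
      vel = rng.uniform(-1, 1, size=(30, 2))

      for _ in range(100):
          cohesion = (pos.mean(axis=0) - pos) * 0.01    # steer toward the center
          alignment = (vel.mean(axis=0) - vel) * 0.05   # match the average heading
          separation = np.zeros_like(pos)
          for i in range(len(pos)):                     # steer away from close neighbors
              d = pos[i] - pos
              dist = np.linalg.norm(d, axis=1)
              near = (dist > 0) & (dist < 5)
              separation[i] = d[near].sum(axis=0) * 0.02
          vel += cohesion + alignment + separation
          pos += vel

      print("swarm spread after 100 steps:", pos.std(axis=0).round(1))
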
    Click Here to View Full Article

  • "Can a Robot Save Hubble? More Scientists Think So"
    New York Times (08/24/04) P. D1; Leary, Warren E.

    With the grounding of shuttle flights after the Columbia disaster, NASA administrator Sean O'Keefe had said the agency would scrap plans to repair the Hubble Space Telescope and instead allow it to die as its batteries and gyroscopes degrade. However, researchers at the Goddard Space Flight Center recently proposed having a robot repair the telescope, an idea that has won over many skeptics and gained the support of O'Keefe, who projects that more than $1 billion in new congressional funds will be needed. Robot repair would be a groundbreaking new step for NASA, but one that is in line with President Bush's exploration initiative targeting the Moon and eventually Mars; Moon-based or orbital operations will require heavy use of robots for the assembly of spacecraft and other tasks, says NASA exploration associate administrator Craig Steidle. The major challenge will be integrating existing technology for the near-term mission, but NASA has several robot candidates ready, the first of which is the Canadian Space Agency's Dextre robot, originally intended for the International Space Station; that robot features two multi-jointed arms with pressure-sensitive clamps at the ends. A Hubble mission would likely include an expendable rocket that, with robot assistance, would dock with the telescope. The repair robot would then open the Hubble, replace parts and possibly perform other repairs and upgrades, after which the entire repair assembly would fall out of orbit with the old parts. Space Telescope Science Institute departing director Dr. Steven Beckwith was formerly skeptical of a robot repair solution, but has been increasingly convinced by the argument: "Now that servicing Hubble is coincident to NASA's new exploration goal, I think this is a great way to go," he says.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "ZigBee Takes It Easy"
    Technology Review (08/19/04); Brown, Eric S.

    ZigBee networking technology is soon to be included in professional installation kits for light switches, thermostats, security controls, and other simple electronic controls in the home. The ZigBee Alliance plans to build from a modest base in the home-automation market before it launches into grander pervasive-technology schemes, such as city-wide biohazard monitoring. ZigBee is well-suited for home controls because it quickly sends small packets of information, such as keyboard inputs or on/off switching, over distances of up to 20 feet between nodes. Moreover, ZigBee data piggybacks on intermediary nodes and is only acted upon by the specified recipient device, allowing data to travel the entire network as long as there are available pathways; a sketch of this relaying follows. ZigBee batteries can last for years because the devices only turn on for the moment or so when they are needed, and the non-proprietary technology makes use of the same unlicensed 900 MHz and 2.4 GHz frequencies already used by cordless phones, for example. ZigBee promises new home-automation capabilities for homeowners, possibly as early as 2006, while home builders like the technology because it offers a wireless alternative to costly cable installation; a ZigBee-enabled light switch, for example, could be moved without leaving a hole in the wall. Further out, ZigBee, which can be used in mesh networks of up to 65,000 nodes, promises new capabilities for building managers to monitor and even control building systems.
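
    The node-to-node relaying described above amounts to path-finding over the mesh; a minimal sketch with an invented four-node topology:

      from collections import deque

      # Invented mesh topology: each node can hear only its near neighbors, so
      # a packet from the thermostat to the furnace is relayed hop by hop.
      mesh = {
          "thermostat": ["hall_switch"],
          "hall_switch": ["thermostat", "kitchen_switch"],
          "kitchen_switch": ["hall_switch", "furnace"],
          "furnace": ["kitchen_switch"],
      }

      def route(src, dst):
          """Breadth-first search for the shortest relay path through the mesh."""
          queue, seen = deque([[src]]), {src}
          while queue:
              path = queue.popleft()
              if path[-1] == dst:
                  return path
              for nxt in mesh[path[-1]]:
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append(path + [nxt])

      print(route("thermostat", "furnace"))
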
    Click Here to View Full Article

  • "By the Book"
    InformationWeek (08/23/04) No. 1002, P. 36; Chabrow, Eric

    Undergraduate computer-science enrollments at U.S. universities are suffering a sharp decline, spurred by students' fears for job stability amid layoffs and offshore outsourcing, and by IT positions' loss of prestige and compensation after the dot-com bust. Another factor in the enrollment fall-off is the growing prominence of biotechnology and other fields. The number of incoming undergraduate computer-science majors at Ph.D.-granting schools dipped from approximately 24,000 in 2000 to 17,700 in 2003, according to the Computing Research Association. This trend could spell trouble for the U.S. IT industry: breakthrough innovations in robotics, artificial intelligence, and other emergent fields necessary for sustained economic advantage could slip, while a lack of skilled domestic IT staff could encourage U.S. companies to continue moving IT operations offshore to countries where talented tech graduates are abundant. Another threat to computer science is the idea that students will not pursue IT careers because they think all the cool research has already been done or that all the jobs have been offshored, though former ACM President Stuart Zweben of Ohio State University insists this is not the case. The quantity of IT graduates is one area of concern, but IBM fellow and author Grady Booch is even more concerned with quality: He reports that U.S. IT graduates are usually skilled in out-of-the-box thinking but less adept at business skills such as project design and teamwork. A more interdisciplinary approach to computer-science education may be necessary, and some schools are following such a strategy by adding topics such as security, information assurance, and business processes to their core curricula. Julie Horsman with Paccar's IT organization explains that university computer-science programs must train adaptable people who are competent in a wide spectrum of business-technology disciplines.
    Click Here to View Full Article

  • "6 Myths of IT"
    InfoWorld (08/16/04) Vol. 26, No. 33, P. 35; Zeichick, Alan; Scannell, Ed; Moore, Cathleen

    InfoWorld explores a half-dozen popular myths about IT, and determines that all but one rest on factually unsound foundations. The claim that server upgrades matter is a myth for the simple reason that upgradability goes largely unused: tier-one server manufacturers decline to provide statistics on upgrades to their products, although one anonymous vendor relayed a marketing manager's informal observation that "the majority of customers purchase initially a server populated with the RAM and processors for future growth." The general pattern is that the smaller the server, the less likely its hardware will be touched after deployment. Mainframes are alleged to store 80 percent of corporate data, when in fact the average is 50 percent or less: This myth stems from IBM's long-sustained domination of the mainframe market, but many believe this chokehold is loosening with the advent of the Internet, SANs, and NAS devices, as well as the spread of Web pages and digital data management and storage technologies. Analysis also debunks the myth that CIOs and CTOs need more business savvy than technical expertise: a narrow focus on short-term gains can hurt a company's competitive edge by postponing long-term tech projects until the economic climate improves. Another myth posits that most IT projects are unsuccessful, yet failure is relative: Jim Shepherd of AMR Research defines failure as project abandonment or a project whose glitches seriously impair the user's business--and such scenarios are a rarity, in his evaluation. It is untrue that IT does not scale, as any technology can support scalability as long as the components are properly mixed and deployed effectively. The one IT myth InfoWorld finds close to the truth is that all major companies support heterogeneous platforms, a trend that analysts and experts say is growing because of mergers and acquisitions, the leverage heterogeneity offers, and enterprises' desire to be global, online, and in operation 24/7.
    Click Here to View Full Article

  • "Wi-Fi Plays Defense"
    Computerworld (08/23/04) Vol. 32, No. 34, P. 19; Wexler, Joanie

    Despite the ratification of the 802.11i standard for augmented wireless LAN (WLAN) security in June, WLANs still have a way to go before they are truly secure: The switch to 802.11i-enabled products and their adoption by IT organizations will not happen overnight, and the Wi-Fi Alliance is not slated to begin interoperability testing of 802.11i products until September. Other roadblocks to WLAN security include a lack of confidence among administrators that they can properly deploy security components in light of new intrusions, while older client devices such as bar code scanners that cannot be upgraded to 802.11i could be particularly vulnerable. Enhanced WLAN security has already been available for about a year and a half in Wi-Fi Protected Access (WPA), a subset of 802.11i that bolsters the security of encryption keys by rotating them on a per-packet basis; full 802.11i goes further, supplanting the RC4 stream cipher used by both WPA and 802.11's original Wired Equivalent Privacy with the Advanced Encryption Standard (AES). Analyst Michael Disabato predicts that AES will be a universal requirement within three to five years, once RC4 is cracked. Gartner attributes 70 percent of the successful Wi-Fi attacks it expects through 2006 to misconfigured access points and client software, and the SANS Institute recommends regular wireless network audits to catch such vulnerabilities. Not all network administrators have a clear idea of how to integrate the various elements of WLAN security, which Dunkin' Donuts IT director Boris Shubin describes as "iffy." Chairman of the 802.11i Task Group Dave Halasz expects most enterprises to choose some form of EAP authentication according to the kind of database they possess. On the other hand, Tom Hagin of NetXperts remarks that Wi-Fi security has moved "from prepuberty to just past puberty" with the approval of the 802.11i standard.
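
    802.11i's AES-based protection encrypts each frame under a fresh nonce; the idea can be sketched with the Python cryptography package's AES-CCM primitive (key handling and frame contents invented; this is not the actual CCMP frame format):

      import os
      from cryptography.hazmat.primitives.ciphers.aead import AESCCM

      key = AESCCM.generate_key(bit_length=128)  # stand-in for a pairwise session key
      aesccm = AESCCM(key)

      frame = b"wireless payload"   # invented frame contents
      header = b"frame header"      # authenticated but not encrypted
      nonce = os.urandom(13)        # per-frame nonce; must never repeat under one key

      ciphertext = aesccm.encrypt(nonce, frame, header)
      assert aesccm.decrypt(nonce, ciphertext, header) == frame
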
    Click Here to View Full Article

  • "Software in the New Millennium: A Virtual Roundtable"
    IT Professional (08/04) Vol. 6, No. 4, P. 10

    Predictions about the future of software development by a virtual roundtable of experts polled by IT Professional appear to support former Computer editor-in-chief Ted Lewis' conclusions in an earlier survey that there is a wide gulf between academia and industry, with the former group intensely focusing on programming languages, operating systems, and algorithms, and the latter emphasizing standards and market-leading trends. Each surveyed expert has their own opinion on what the future of software in the next millennium entails: Joseph Williams with Microsoft Consulting Services says software applications, software development techniques, and the career of software engineering will all experience dramatic changes, while software research and development engineer Dan Ferrante defines software's future in terms of its ability to simplify jobs and life. Iowa State University's J. Morris Chang contends that software's impact on how people work and live should be the primary consideration, while IT consultant Thomas C. Jepsen says software will be increasingly defined as an abstraction for human-machine communications; Mitretek Systems CTO Gilbert Miller expects software to deliver functionality and as-yet unseen practical applications. Technologies the roundtable members think deserve close attention include open-source technologies, nanotechnology, a revamping of the way standardization bodies operate, and a move away from incremental improvements and toward more radical innovations. The respondents offer a diversity of trends they expect to profoundly influence the future of software in the next decade, including outsourcing, Web services, greater availability, and distributed computing interoperability bolstered by the popularity of virtual machines and their interactions with the Internet. Those polled have various opinions on how future software engineers or IT workers should be educated: Williams calls for an emphasis on business and personal innovation skills, while Penn State University professor Philip A. Laplante wants greater concentration on math, computer engineering, and hard-core computer science studies; Ferrante, meanwhile, thinks every software engineer college graduate should be well-versed in Java, Python, Perl, C++, and C#.
    Click Here to View Full Article


 