Six Easy Pieces: How to Do Cost of Ownership Analysis Better

If cost of ownership analysis is such a painful exercise for IT organizations, why has almost every company done it (and continued to do it) multiple times? In this article, we will identify six key elements of effective cost of ownership analysis, which you can use to improve the accuracy of, and eliminate the frustration associated with, this necessary step in your IT evolution.

1) Analyze Platforms, Not Servers

First, evaluate the current "platforms" within your environment, including all servers of all types, in order to simplify the process. Why start there? Simply because management requires an accurate understanding of current IT costs and strengths so they can better assess new ideas and technologies.

One of the most difficult things to "get right" in an analysis of this type is an exact match between a given technology and its associated costs. Limiting the scope to a few machines or a single new application makes cost of acquisition simple to determine, but it makes every other cost almost impossible to quantify without controversy. The easiest way to get the match right is therefore not to limit the technology scope but to expand it to cover all the technology in the IT budget. A platform approach will result in the development of a new "view" of the IT budget that is platform based. This gives the study team tremendous leverage if discussions should wander to "I think this amount is too high for platform A": if the amount is reduced for platform A, it must be raised for platform B. What does B think of that?
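To make the mechanics concrete, here is a minimal sketch of how such a platform view might be assembled; the line items, platform tags, and dollar amounts are purely illustrative:

```python
from collections import defaultdict

# Illustrative budget line items: (description, platform, annual cost).
# In a real study these come from the IT budget itself, tagged by the team.
budget_lines = [
    ("x86 server maintenance", "x86", 420_000),
    ("Mainframe software MLC", "mainframe", 1_100_000),
    ("UNIX hardware refresh", "unix", 650_000),
    ("x86 OS licensing", "x86", 310_000),
]

platform_view = defaultdict(int)
for _description, platform, cost in budget_lines:
    platform_view[platform] += cost

# The sanity check that anchors the whole discussion: the platform view
# must sum back to the original budget total.
assert sum(platform_view.values()) == sum(cost for _, _, cost in budget_lines)

for platform, total in sorted(platform_view.items()):
    print(f"{platform:>10}: ${total:,}")
```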

The advantage of this approach is that the total in this view should match the total in the budget. This places the entire cost discussion on solid footing - the IT budget - and allows the process to be managed dispassionately, a key to later acceptance of the results.

2) Focus on a Representative Application and Include All the Pieces

Next, let's consider a new business-critical application or workload that requires platform selection. By definition, a critical application will require careful design, careful sizing, careful maintenance, operation, support, and disaster recoverability. It may also require new or dedicated infrastructure, but at a minimum it will tax the existing infrastructure. Once again, the key to success is not to limit the view to a subset of components. The "view" developed in the previous step should facilitate this type of analysis.

Each of these components, and their associated costs, should be included in any cost of ownership comparison. Over the past ten years, our group within IBM Lab Services has been doing IT Systems and Storage Optimization ("Scorpion") Studies that focus on this type of view and component-based analysis. Our findings show that a typical ratio of production Web, application, and database servers to "everything else" - support, maintenance, disaster recovery, and so on - is about one to one. This means that any analysis that omits those other components may miss half of the real costs. The discrepancy grows for very large critical applications, which is largely why our industry hasn't done so well sizing many new enterprise application suites. Vendors feed this controversy to gain competitive advantage. We've all heard the stories.

3) Consider Practical Capacity, Not Vendor Ratings

System capacity and performance can quickly become a very tedious and esoteric discussion, and in many cost of ownership efforts, it does. This can be avoided. Our experience is that the most important aspect of performance analysis within cost of ownership is not which vendor claim or benchmark is used as a base, but rather (a) what system utilizations are "normal" in your current environment, and (b) what is a reasonable expectation for the future. Use any reasonable performance metric - expected utilizations are far more important. Often, distributed server utilizations are very low, and there is a good reason for it: an underutilized server requires no capacity planning. If average server utilizations in your environment are low, model a future state of 2 or 3 times the current level for each component in the possible solution. No higher.

Most cost analyses are part of a technology acquisition process, so higher future-state utilizations are assumed. This is particularly true with the rise of virtualization, which is almost always assumed in cost of ownership comparisons; transitioning from a non-virtualized to a virtualized server environment has some significant advantages, including higher potential utilization.
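As a back-of-the-envelope illustration of that sizing rule (the core counts, utilizations, and growth factor below are assumptions for the example, not recommendations):

```python
def future_capacity_needed(current_cores: float, current_utilization: float,
                           growth_factor: float = 3.0,
                           target_utilization: float = 0.60) -> float:
    """Size a future configuration from observed, not rated, capacity.

    current_utilization: measured average busy fraction (0.10 = 10% busy)
    growth_factor: the "2 or 3 times current" future-state multiplier
    target_utilization: what you realistically expect to sustain,
    not a vendor benchmark figure
    """
    work_today = current_cores * current_utilization   # cores' worth of real work
    work_future = work_today * growth_factor
    return work_future / target_utilization

# A 16-core server averaging 10% busy, planned at 3x growth and a 60% target,
# needs only 8 cores' worth of capacity - far less than its rated size.
print(round(future_capacity_needed(16, 0.10), 1))  # -> 8.0
```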

Don't assume world-class utilization numbers, however, unless you know what kind of effort it will take to attain them: capacity must be managed, and that management has a cost. IBM's System z mainframe environments typically demonstrate this fact quite well. Mainframes usually run at very high utilizations around the clock, and they can do this because the level of internal standardization and automation is much higher than on other platforms. Other platforms will eventually attain these levels, but that is still years of vendor development away.

4) Don't Ignore Labor Costs to Protect the Innocent

The most difficult topic within cost of ownership is undoubtedly the cost of labor.

High Full Time Equivalent (FTE) ratios have been an industry target for years, and most IT professionals can quote the current best practice and describe how they are exceeding it. IT infrastructure support organizations have been managing to these ratios for years using two basic strategies: (a) improve efficiency; or (b) push work onto other parts of the IT organization. The extent to which strategy (b) is used differs by platform for a variety of reasons, but the result is the same: any cost of ownership analysis that limits labor calculations to IT infrastructure support headcount will likely miss major portions of the real support costs and skew the results. Moreover, in a down economic cycle, most staff see nothing positive in quantifying the cost of labor for "their" platform. Therein lies a problem.

A good solution to this problem is an approach similar to item one. The same kind of "view" is now developed for the assignment of labor, with the underlying organization chart as the foundation. Consider the entire IT organization and apportion every group that is not truly platform neutral (and even individuals within an otherwise neutral group, like network support) to the appropriate platform labor category. The results will look quite different from industry-published norms. They will be higher - up to 2 times or more on x86 platforms - and will reflect true insight into cost. And because the resulting labor cost numbers cross organizational lines, no one group will feel singled out as responsible for them, or for lowering them.
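A toy example of that apportionment; the group names, costs, and platform splits are hypothetical:

```python
# Each group's fully loaded labor cost is split across platforms. Groups that
# are not platform neutral are apportioned rather than ignored, which is what
# pushes the resulting ratios above published norms.
groups = {
    "x86 support":       {"cost": 2_400_000, "split": {"x86": 1.0}},
    "mainframe support": {"cost": 1_800_000, "split": {"mainframe": 1.0}},
    "network support":   {"cost": 1_200_000, "split": {"x86": 0.7, "mainframe": 0.3}},
    "service desk":      {"cost":   900_000, "split": {"x86": 0.8, "mainframe": 0.2}},
}

labor_by_platform: dict[str, float] = {}
for group in groups.values():
    for platform, share in group["split"].items():
        labor_by_platform[platform] = (
            labor_by_platform.get(platform, 0.0) + group["cost"] * share)

for platform, cost in sorted(labor_by_platform.items()):
    print(f"{platform:>10}: ${cost:,.0f}")
```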

A side benefit of the process stems from the two strategies often used to manage FTE ratios - productivity improvement and narrowing of responsibility. If the FTE ratios are changed significantly by the new "view" of the organization, the need for productivity tools will become evident. Resistance to the process will be lessened and, again, buy-in should be improved.

5) Quantify QoS in a Way that Makes Sense

Quality of Service (QoS) is an elusive topic, since it has so many aspects that differ in importance between companies, but some general trends can give guidance. In the years that we have been working with customers on these studies, we've seen an alarming trend toward high complexity within distributed systems - old hardware, old software, multiple releases of everything to be maintained - and a lack of investment in systems management software. This is in stark contrast to the mainframe, where software costs tend to be high but systems are maintained at strict currency levels, with the result that staffing has been flat or dropping for years with steadily improving QoS.

In this age of real-time systems, disaster recovery has become a universal need. Two key metrics in disaster recovery are Recovery Time Objective (RTO - the time to bring alternative systems online for use) and Recovery Point Objective (RPO - the age of the data on those recovered systems). If we consider the dominant RTO and RPO for a given platform, we gain insight into both cost and QoS. Though any system can be made disaster recoverable, there is a huge cost differential between making a single mainframe recoverable and making 1,000 distributed systems recoverable; the majority of customers we've worked with have done the former and not the latter because of the cost. The differential can be quantified very easily with a call to a recovery services provider and should be included in the platform cost comparison.
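As an illustration of the kind of comparison such a quote enables (all fees below are hypothetical placeholders, not provider pricing):

```python
def dr_cost(per_system_setup: float, per_system_annual: float,
            systems: int, years: int = 3) -> float:
    """Total cost to make `systems` recoverable over a planning horizon."""
    return systems * (per_system_setup + per_system_annual * years)

# One mainframe vs. 1,000 distributed systems, with hypothetical fees.
mainframe = dr_cost(per_system_setup=250_000, per_system_annual=150_000, systems=1)
distributed = dr_cost(per_system_setup=5_000, per_system_annual=3_000, systems=1_000)
print(f"mainframe:   ${mainframe:,.0f}")    # $700,000
print(f"distributed: ${distributed:,.0f}")  # $14,000,000
```

Even with generous per-system pricing, the sheer count of distributed systems dominates the result.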

Cloud computing and other metered services will certainly offer a recoverable option, and here lies an opportunity. Since IT will have to compete with public clouds, the above cost analysis can be used to set internal cost guidelines for a corporate cloud infrastructure very early in the cloud development process. Or it can be used to steer workload onto the platform that is already recoverable, thus eliminating some of the need to develop a recovery capability where it currently does not exist.

6) Look at Costs Incrementally - Plot Your Own Course

The last topic to consider is primarily financial. There is a "sunk cost" and an "incremental cost" associated with IT infrastructure that must be considered: just as the first chip to roll off a fabrication line is worth billions and the second worth pennies, the first workload for a platform is far more expensive to provision than subsequent ones. This is especially true for the mainframe, since the technology may be physically refreshed but financially the transaction is handled as an upgrade - unlike the distributed world, where technology and book value are tied together. IBM has taken this concept a step further with the arrival of mainframe "specialty" engines that have much lower price points and drastically reduced impact on software costs. They cannot run alone, however; they must be added to an existing system. It is not unusual for mainframe systems in production to cost $4,000/MIP while specialty engine upgrades may run only $200/MIP - incremental costs one-twentieth of the current cost.
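A quick worked example of what those figures mean for blended cost, assuming (hypothetically) that growth equal to 20% of the installed base lands entirely on specialty engines:

```python
# Blended cost per MIP when growth lands on low-incremental-cost capacity.
# $4,000/MIP and $200/MIP are the article's figures; the MIPS volumes are
# illustrative assumptions.
base_mips, base_cost_per_mip = 10_000, 4_000
added_mips, added_cost_per_mip = 2_000, 200

total_cost = base_mips * base_cost_per_mip + added_mips * added_cost_per_mip
blended = total_cost / (base_mips + added_mips)
print(f"blended cost: ${blended:,.0f}/MIP")  # ~$3,367/MIP, vs. $4,000 before growth
```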

These kinds of dramatic differences must be considered in cost of ownership analysis and are often large enough to justify a change in course for IT. Exploiting these areas of low incremental cost to support growth can significantly improve the overall cost of IT. Virtualization is expected to have a similar effect on other platforms, so the need to look at costs incrementally is universal and growing.

With 33 years at IBM, John Schlosser is currently a Senior Managing Consultant for the Scorpion practice within IBM Systems and Technology Group - Lab Services & Training. He is a founding member of the group, which was started in 1999, and has developed and modified many of the methodologies the team uses for IT infrastructure cost analysis.

PayChoice offers more details about data breach

PayChoice, which this week confirmed that its online payroll systems operations were breached on Sept. 23, is now beginning to offer details on what it thinks may have happened. The company did not publicly inform the media until earlier this week, when Washington Post columnist Brian Krebs revealed some of what is known about the intrusion; PayChoice today tells Network World the company "was preparing a timely public statement before the Washington Post report." "We are concerned that PayChoice has joined a growing list of other well-known firms that have been victimized by cyber criminals," says PayChoice CEO Robert Digby in a statement. That ever-growing list, of course, could include Heartland Payment Systems, which disclosed a data breach earlier this year that has had enormous impact on banking and card processing as it became known that cybercriminals had a chance to dip into information about 100 million payment cards.

But that incident came to light because Heartland CEO Robert Carr coordinated an outreach to proactively inform the public, through the media, about the data breach, and he has not shied from taking tough questions. The same could be said about Hannaford Brothers, the Portland, Maine-based supermarket chain, whose CEO Ronald Hodge stepped forward last year to disclose a breach there of customer payment information. Morristown, N.J.-based PayChoice provides payroll processing services and also licenses its payroll-management product to 240 payroll-processing firms serving 125,000 organizations. The company says it became aware of the attack "when it saw what appeared to be phishing e-mails telling clients they should download a browser plug-in to continue using their online accounts." "The e-mails included client user names and partial passwords, which indicated a breach of PayChoice's Online Employer website." PayChoice says that "within hours of the attack, the company notified its clients, shut down the site, and deployed further security measures to protect client information before restoring access to the system." PayChoice has also notified authorities and federal law enforcement. "Only customers using Online Employer were affected," PayChoice said in its statement. "The majority of PayChoice's clients, those using telephone, fax or other non-Web-based input methods, were not impacted." PayChoice contends there is no evidence of unauthorized access to sensitive employee information, but the firm adds that "clients should notify employees to carefully review their bank, credit card and other statements and to notify law enforcement officials immediately if they discover suspicious activity." The firm says it has also engaged forensics experts to investigate further, and according to Digby's statement, "we will be reviewing all aspects of our security protocol to add any additional necessary protective measures."

DHS to get big boost in cybersecurity spending in 2010

The U.S. Department of Homeland Security will likely have a substantially bigger cybersecurity budget for fiscal 2010 than it did this year. The U.S. Senate yesterday passed legislation approving nearly $43 billion for the DHS for fiscal 2010. Of this, about $397 million is meant for addressing cybersecurity issues within the agency - $84 million, or about 27%, higher than the $313 million that was allocated for information security spending in 2009. In approving the amount, the Senate said the increase was aimed at expediting efforts to combat cyberthreats through such measures as reducing the points of Internet access across the department and increasing security training and management capabilities. The Senate measure reconciles the different appropriation bills that were passed by the Senate and House appropriation committees. The bill is headed to President Obama for his signature.

The increase approved for DHS cybersecurity is "pretty hefty," said John Pescatore, an analyst with Gartner Inc. in Stamford, Conn. "The real key thing is where does the money go?" he said. "How much of it will fund the same old stuff and how much of it will be on R&D?" The amount set aside for DHS cybersecurity spending next year looks "about right," said Karen Evans, the former de facto federal CIO under the Bush administration. The amount is meant for DHS internal operations only and is consistent with increases provided to other departments and agencies, she said. Cybersecurity spending across the government for 2010 is projected at about $7.5 billion, or about 10% of the total IT budget of $75 billion, she added. The $43 billion appropriations bill included similar increases in other technology areas. A budget of about $1 billion - about $75 million more than in 2009 - was approved for the DHS's science and technology directorate, which conducts research in such areas as cybersecurity and air cargo security.

The bill also provides an additional $91 million, on top of a previously approved $60 million, for a massive data center migration and consolidation effort underway at the DHS. The consolidation is part of a move by the DHS to build a new 4.5 million-square-foot facility in the Washington area. Obama's economic stimulus package, which passed earlier this year, provided another $248 million for planning, design, IT infrastructure, fixtures and other costs related to the consolidation. The Senate bill also extended the DHS' online employment eligibility verification program, called E-Verify, by three years and allocated $137 million for operating the system and improving its reliability and accuracy, including linking it to driver's license databases around the country. Opponents of E-Verify had claimed the system was too buggy and error-prone to be used as the federal government's primary tool for employment verification. Meanwhile, in what appears to be a reaction to growing opposition to the program, funding for the controversial Real ID initiative was cut back 40%, from $100 million to $60 million, in 2010. Real ID, launched during the Bush administration, requires states to meet new federal standards for issuing driver's licenses.

Chip maker TSMC buys stake in solar cell maker Motech

Taiwan Semiconductor Manufacturing (TSMC) has bought a 20 percent stake in solar cell maker Motech Industries for NT$6.2 billion (US$193 million), the companies said in a joint statement on Wednesday. The two companies will work together on new business ventures, and TSMC will work with Motech to launch new products faster and to evaluate other opportunities in the solar business. Motech, Taiwan's largest maker of solar cells - the key component of solar panels and solar modules - will become a key part of TSMC's move into green industries through the investment.

TSMC is the world's largest contract chip maker. The company revealed a plan earlier this year to build a new business around energy conservation, including the solar and LED industries. TSMC, like other chip makers, works with the same polysilicon material used for solar cells, and such companies believe they have an edge in the industry thanks to their materials research and their expertise in manufacturing and management. TSMC invested US$46 million earlier this year to open an LED (light-emitting diode) production line in central Taiwan, a move investment bank Credit Suisse called a first step into the LED lighting business. At TSMC's second-quarter earnings conference, Chairman and CEO Morris Chang said the solar and LED businesses will likely generate revenue of as much as US$10 billion to $15 billion for TSMC by 2018. Governments worldwide, including Germany and the U.S. state of California, have offered incentives for people to use solar panels amid rising oil prices, and the solar and other alternative energy industries are also being nurtured by governments such as the U.S. with stimulus money meant to battle the global recession.

Apple's iPad marketing sparks complaint to FTC

Apple's iPad, announced Wednesday, has already led to one complaint to the U.S. Federal Trade Commission, in which a consumer charged Apple with false advertising for showing Adobe Flash working on the device. Apple CEO Steve Jobs unveiled the iPad to a waiting world this week after months of speculation and rumors about the slim tablet computer, and Apple's marketing machine has continued to roll since then. At its launch event in San Francisco, Apple showed a video of people using the device, and a series of images of the iPad dominates the company's home page.

But those promotions are misleading, according to Paul Threatt, a Web and graphic designer who lives near Atlanta. Both show the iPad displaying elements of the online New York Times that cannot be viewed without Flash, even though the device apparently doesn't support that software. Flash is used to deliver multimedia, games and other content on many Websites; in fact, Adobe says more than 70 percent of all games and 75 percent of all video on the Web use Flash. But it appears that the iPad, like the iPhone and iPod Touch, doesn't support the format. Adobe was not approached by Apple before the product was announced, Adobe spokesman Stefan Offermann said, indicating Flash is probably not in the works for the shipping version of the iPad, due to hit stores in about 60 days. "I don't hold anything against them for not supporting Flash," Threatt said. "It'd be great if they did, but what I don't want them to do is misrepresent the device's capabilities." In a live demonstration at Wednesday's launch event, icons indicating a missing plug-in popped up on the iPad's screen when Jobs was showing off the Web front page of the New York Times.

However, some of the same New York Times content that wouldn't display during the live demonstration shows up just fine in a promotional video and in one of the pictures on Apple's home page, Threatt pointed out. A collection of still images halfway down the Times front page, which represent segments in the site's video section, needs Flash to appear but is visible as a user looks over the front page in the Apple clip. Likewise, a slideshow of images accompanying the article "The 31 Places to Go in 2010" requires Flash, yet one of those pictures shows up in the promotional video and in an image of the iPad on Apple's page. (In another twist, in the video the slideshow begins at what appears to be the 14th image, even though on the Web it starts with the first picture.) Threatt said he was clued into the discrepancies by a Friday post on the AppleInsider blog. He was already aware of the FTC's online Complaint Assistant, because he had used it before. "Whenever the urge strikes me and I feel like someone is being deliberately misleading, I go to that site," Threatt said. "I never know if I'm screaming into the void or not, but it makes me feel better." In his complaint, Threatt briefly explained the technologies to the FTC, then summed up the problem. "In several advertisements and images representing the Apple products in question, Apple has purposefully elected to show these devices correctly displaying content that necessitates the Adobe Flash plug-in. This is not possible on the actual devices, and Apple is very aware of that fact. ... This constitutes willful false advertising and Apple's advertising practices for the iPhone, iPod Touch, and the new iPad should be forcibly changed," Threatt wrote, in part. Apple did not immediately respond to a request for comment.

The FTC was not able to give details about the complaint or how it would respond. Threatt wasn't an obvious candidate to lodge the complaint. He described himself as a longtime Apple fan who used to work weekends at the Apple Store in Tyson's Corner, Virginia. "I'm big into Apple, and I always liked helping people learn Apple stuff," Threatt said, adding that it pains him to turn against the company. As for the iPad, Threatt said he would love it as a step up from his iPod Touch - if only it had a video camera.

Lotus user wary of social networking tool rollout

As IBM moves to upgrade its cache of social networking tools, some users are taking a cautious approach to the technology while figuring out where it will apply and how to measure its effectiveness. IBM Tuesday unveiled Lotus Connections 2.5, its upgraded lineup of social networking tools and a major expansion of the company's suite of collaboration software. The new 2.5 version includes micro-blogging, file sharing and new mobile capabilities. But some of the features are expanding faster than users' plans to utilize the software.

One Connections 2.5 beta tester, a global consumer product corporation, is taking a deliberately slow approach to rolling out the social collaboration tools. The company's manager of messaging and collaboration asked for anonymity because he was not authorized to speak on the record. The company started slow, with a few hundred users who were only allowed to communicate with each other. The group's size was eventually doubled, and then the tools were opened up companywide. At that point, the manager says, the number of users exploded by 650% to a few thousand. Despite the growth, the company is still "seeding the environment," said the manager, but a broader rollout is planned.

The harder part to plan is the expected results, because the company has yet to figure out how to measure its return on investment. The company will likely "wind up doing it anecdotally," said the manager. "The things we're struggling with there is that this doesn't match the ROI [metrics that executives] are used to looking at. How do you measure, 'we recruited this person because of the [collaboration tool]?'" While results are hard to gauge, the broader, anticipated benefits are being defined in the context of capturing and recording corporate knowledge. For example, a certain administrative assistant may routinely be tasked with booking a certain type of event, said the manager. That worker could develop a how-to guide for use by others, he said.

The manager said it is a good time to ramp up internal communities and knowledge-sharing because, as the economy and job markets rebound, workers who may have suffered pay or benefit cuts amid the recession will be looking to move on. "Now is the time to get people to put information in, so you're not losing it on the back of a Post-it note."

Snow Leopard bug deletes all user data

Snow Leopard users have reported losing all their personal data after logging into a "Guest" account following an upgrade from Leopard, according to messages on Apple's support forum. The bug, users said in a well-read thread on the forum, resets all settings on the Mac, resets all applications' settings and erases the contents of critical folders containing documents, photos and music. The MacFixIt site first reported the problem more than a month ago.

Users claimed that they lost data after logging into their Macs using a "Guest" account, either purposefully or by accident. Reports of the bug go back to Sept. 3, just six days after Apple launched Snow Leopard, or Mac OS X 10.6; users who encountered the bug said they had upgraded their systems from Mac OS X 10.5, known as Leopard. Specifically, Snow Leopard's home directory - the one sporting the name of the Mac's primary user - is replaced with a new, empty copy after users log in to a Guest account, log out, then log in to their standard account. All the standard folders - Documents, Downloads, Music, Pictures and others - are empty, while the Desktop and Dock have reverted to an "out-of-box" condition. "I had the Guest account enabled on my MacBook Pro," said a user identified as "tcnsdca" in a message posted Sept. 3. "I accidentally clicked on that when I went to log in. It took a few minutes to log in, then after I had logged out of that account and back into mine, my [entire] home directory had been wiped. All of doc, music, etc. gone." "Add my parents to the list of people waxed by this bug," added "Ratty Mouse" today on the same thread. "Brand new iMac, less than one month old, EVERYTHING lost.

Just as I convinced them to go Mac after years of trying." On the thread, several users urged others to disable any Guest accounts to prevent accidental data loss. Some people were able to restore their Macs using recent Time Machine backups, but others admitted that they had not backed up their machines for weeks or months. "Just my luck I hadn't made a backup since 11th August," acknowledged "rogerss" on a different support forum thread. "So annoyed now, in the process of restoring from Time Machine, but have lost loads of my work due to this fault." Other users had neglected to back up their Macs at all. "This morning I had access to Guest Account and than all my data were lost!!!" bemoaned someone tagged as "carlodituri" last Saturday. "I had 250GB of data without backup and I lost everything: years and years of documents, pictures, video, music!!! Nooooo!!! Is it possible to recover something? Please help me!!!!" Not surprisingly, users unaffected by the bug were reluctant to attempt to reproduce the problem. Some, for instance, wondered whether the data loss would be triggered on Macs upgraded to Snow Leopard when the Guest account was simply set to "Sharing only," which is the default.

Apple did not respond today to questions about the bug.

Heartland CEO: Credit card encryption needed

Credit card transactions in the U.S. are often not encrypted, and credit card vendors, payment processors and retailers need to embrace an encryption standard to protect card numbers, the CEO of a breached payment processor said Monday. Payment card industry guidelines do not currently require credit card numbers to be encrypted in transit between retailers, payment processors and card issuers, Robert Carr, chairman and CEO of Heartland Payment Systems, told a U.S. Senate committee. Heartland in January announced the discovery of a data breach that left tens of millions of credit card numbers exposed to a gang of hackers. "I now know that this industry needs to, and can, do more to better protect it against the ever-more-sophisticated methods used by these cybercriminals," Carr told the Senate Homeland Security and Governmental Affairs Committee. "I believe it is critical to implement new technology, not just at Heartland, but industrywide." The purpose of the committee hearing was, in part, to determine whether new legislation is needed to fight cybercrime. Heartland is pushing for the credit card industry to adopt an end-to-end encryption standard, Carr said, and the company is deploying tamper-resistant point-of-sale terminals at its member retailers. "Our goal is to completely remove payment account numbers of credit and debit cards and magnetic-stripe data so they are never accessible in a useable format in the merchant or processor systems," Carr said.
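Carr didn't describe a specific mechanism, but the end-to-end idea is straightforward: encrypt the account number at the terminal and decrypt it only inside the processor's systems. A minimal sketch using authenticated encryption with the pyca/cryptography package - an illustration of the concept, not Heartland's actual design:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# In this sketch a single key is shared between terminal and processor, so
# the card number is never in a usable form inside the merchant's systems.
key = AESGCM.generate_key(bit_length=256)

def encrypt_pan(pan: str, key: bytes) -> bytes:
    nonce = os.urandom(12)  # unique per transaction
    return nonce + AESGCM(key).encrypt(nonce, pan.encode(), b"txn-v1")

def decrypt_pan(blob: bytes, key: bytes) -> str:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, b"txn-v1").decode()

blob = encrypt_pan("4111111111111111", key)  # done at the point-of-sale terminal
print(decrypt_pan(blob, key))                # done only inside the processor
```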

The company has also helped to form an information-sharing council for payment processors, through which companies can share information about threats, vulnerabilities and best practices, he said. "We are working on these solutions, both technological and cooperative, because I don't want anyone else in our industry, or our customers, or their customers ... to fall victim to these cybercriminals," he said. Heartland has asked credit card companies to accept encrypted transactions, and the company has engaged standards bodies and encryption vendors, Carr said. Carr didn't give details about the Heartland breach, in which the company's systems were compromised for about a year and a half. However, Heartland paid about US$32 million in the first half of 2009 for forensic investigations, legal work and other charges related to the breach, he said. The company remains involved in investigations and lawsuits stemming from the breach, he said.

Senators asked Carr some pointed questions about the breach. Senator Joe Lieberman, an independent from Connecticut, asked Carr about the extent of the breach. Senator Susan Collins, a Maine Republican, wanted to know how the company could be compromised from October 2006 to May 2008 without discovering the breach. "I was astounded at what a long period elapsed where these hackers were able to steal these credit card numbers," she said. "Explain to me how a breach of that magnitude could go undetected for so long." Card holders were not reporting major breaches, Carr answered. "The way breaches are normally detected is that fraudulent uses of cards are determined," he said. "There was no hint of fraudulent use of cards that came to our attention until toward the end of 2008." Collins pressed him further. "But are there no computer programs that one can use to check to see if an intrusion has occurred?" she asked. "There are, and the cybercriminals are very good at masking themselves," Carr said. In August, Albert Gonzalez of Miami was indicted in New Jersey for the theft of more than 130 million credit and debit cards, according to the U.S. Department of Justice. Gonzalez pleaded guilty last week to separate charges in Massachusetts and New York.

He was charged, along with two unnamed co-conspirators, with using SQL injection attacks to steal credit and debit card information from Heartland, 7-Eleven and Hannaford Brothers, a Maine-based supermarket chain. It's too soon to tell how many credit card numbers processed by Heartland were compromised, Carr said. "We don't know the extent of the fraud at this point," he said. "It's a significant compromise."
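The indictment doesn't detail the specific flaws the attackers exploited, but the general shape of a SQL injection attack - and the parameterized-query fix that blocks it - can be sketched generically (the schema and input here are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, card TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', '4111111111111111')")

user_input = "x' OR '1'='1"  # attacker-controlled value

# Vulnerable: the input is spliced into the SQL text, so the OR clause
# becomes part of the query and returns every row.
rows = conn.execute(
    f"SELECT card FROM users WHERE name = '{user_input}'").fetchall()
print(rows)  # [('4111111111111111',)] - data leaks

# Fixed: a parameterized query treats the input purely as a value.
rows = conn.execute(
    "SELECT card FROM users WHERE name = ?", (user_input,)).fetchall()
print(rows)  # [] - no match, no injection
```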

The six greatest threats to US cybersecurity

It's not a very good day when a security report concludes that disruptive cyber activities are expected to become the norm in future political and military conflicts. But such was the case today, as the Government Accountability Office took yet another critical look at US federal security systems and found most of them lacking. From the GAO: "The growing connectivity between information systems, the Internet, and other infrastructures creates opportunities for attackers to disrupt telecommunications, electrical power, and other critical services.

As government, private sector, and personal activities continue to move to networked operations, as digital systems add ever more capabilities, as wireless systems become more ubiquitous, and as the design, manufacture, and service of information technology have moved overseas, the threat will continue to grow." Within today's report, the GAO broadly outlines the groups and types of individuals it considers key sources of cyber threats to the nation's information systems and cyber infrastructures. According to the Director of National Intelligence, a growing array of state and nonstate adversaries are increasingly targeting - for exploitation and potential disruption or destruction - information infrastructure, including the Internet, telecommunications networks, computer systems, and embedded processors and controllers in critical industries. From the GAO:

Foreign nations: Foreign intelligence services use cyber tools as part of their information gathering and espionage activities.

Criminal groups: There is an increased use of cyber intrusions by criminal groups that attack systems for monetary gain.

Hackers: Hackers sometimes crack into networks for the thrill of the challenge or for bragging rights in the hacker community. While remote cracking once required a fair amount of skill or computer knowledge, hackers can now download attack scripts and protocols from the Internet and launch them against victim sites. Thus, attack tools have become more sophisticated and easier to use.

Hacktivists: Hacktivism refers to politically motivated attacks on publicly accessible Web pages or e-mail servers. These groups and individuals overload e-mail servers and hack into Web sites to send a political message.

Disgruntled insiders: The disgruntled insider, working from within an organization, is a principal source of computer crimes.

The insider threat also includes contractor personnel. Insiders may not need a great deal of knowledge about computer intrusions, because their knowledge of a victim system often allows them to gain unrestricted access to cause damage to the system or to steal system data.

Terrorists: Terrorists seek to destroy, incapacitate, or exploit critical infrastructures to threaten national security, cause mass casualties, weaken the U.S. economy, and damage public morale and confidence. Traditional terrorist adversaries of the United States have so far been less developed in their computer network capabilities than other adversaries, and the Central Intelligence Agency believes terrorists will stay focused on traditional attack methods, but it anticipates growing cyber threats as a more technically competent generation enters the ranks.

Testifying before the Senate Judiciary Committee's Subcommittee on Terrorism and Homeland Security today, FBI Deputy Assistant Director for the Cyber Division Steven Chabinsky said that while the FBI has not yet seen a high level of end-to-end cyber sophistication within terrorist organizations, it is aware of and investigating individuals who are affiliated with or sympathetic to al Qaeda who have recognized and discussed the vulnerabilities of the U.S. infrastructure to cyber attack, who have demonstrated an interest in elevating their computer hacking skills, and who are seeking more sophisticated capabilities from outside of their close-knit circles. "In addition, it is always worth remaining mindful that terrorists do not require long term, persistent network access to accomplish some or all of their goals.

Rather, a compelling act of terror in cyberspace could take advantage of a limited window of opportunity to access and then destroy portions of our networked infrastructure. The likelihood that such an opportunity will present itself to terrorists is increased by the fact that we, as a nation, continue to deploy new technologies without having in place sufficient hardware or software assurance schemes, or sufficient security processes that extend through the entire lifecycle of our networks," Chabinsky said.

Mac News Briefs: PDFpen has new OCR engine

SmileOnMyMac Software has updated PDFpen, incorporating Nuance Communications' OmniPage OCR engine into the PDF editing program. PDFpen 4.5 uses version 15.5 of the OmniPage OCR engine, replacing the open-source Tesseract OCR engine in PDFpen on Intel-based Macs; SmileOnMyMac lauded the OmniPage engine for its accuracy. Besides the new OCR engine, PDFpen 4.5 lets Snow Leopard users scan directly into the application from Image Capture or TWAIN scanners.

The 4.5 update is free for registered users of PDFpen 4.x. The PDF editing application costs $50, with a Pro version available for $100. Both PDFpen and PDFpenPro run on Mac OS X 10.4 and later.-Philip Michaels

Typinator features DropBox syncing

Ergonis Software released a new version of Typinator, its text-replacement utility. Typinator 3.6 features automatic syncing with DropBox, a tool for syncing files across multiple machines (and online). Taking advantage of the new capability is as simple as modifying Typinator's preferences to store its settings folder within the DropBox folder. There's also a new text highlighting tool that selects and highlights text in a single action. The updated Typinator also allows abbreviations that begin with a space, features a simplified registration interface, and offers numerous speed and memory usage improvements. Typinator 3.6 is available now from the company's web site for €19.95 per single-computer license, or €34.99 for a two-machine license; the update is free to anyone who bought the application in the last two years.-Rob Griffiths

Real Software updates development applications

RealBasic and Real Studio 2009, Release 4 shipped Tuesday, adding 97 enhancements and 39 new features to the cross-platform software development tools, according to developer Real Software.

Leading the changes in this latest version is a new report editor, which Real Software says will be included in all RealBasic versions. The report editor lets developers visually create a layout for printing by dragging and dropping labels, fields, images, and more, and it creates both single- and multi-page reports. Real Studio also gets a new build automation feature for its Project Editor. The feature lets developers automate the most common functions of building applications without having to write IDE scripts.

A complete list of what's new in Release 4 is available on Real Software's downloads page, and the software maker also provides a video highlighting new features in RealBasic and Real Studio.-PM

Macvide announces VideoFlash Converter 2.9

Macvide has announced VideoFlash Converter 2.9, an update of its video-to-Flash conversion utility for Mac OS X. VideoFlash Converter converts QuickTime-compatible video files to Adobe Flash and supports many formats, including AVI, WMV, MOV, MPG, ASF, and DivX. The application automatically provides sensible default settings and offers the flexibility to crop video, set duration, adjust quality, and control many other audio and video preferences. VideoFlash Converter gives you the option of creating an HTML file along with the video and lets you customize how viewers see it: you can use the program to have Flash videos play directly in a Web page, not in a new window or separate page. Version 2.9 also includes a new Web update feature and other fixes.

You can designate that the video start automatically and play continuously when viewers access the page, for example. The app also integrates with iWeb. The software works with OS X 10.4 (Tiger) or 10.5 (Leopard) and is a Universal app. VideoFlash Converter is available for $40 per single license and can be downloaded from the Macvide Web site.-Jackie Dove

Algoriddim releases Djay 3

Algoriddim has released Djay 3, a revamped version of its DJ music software for the Mac that integrates with iTunes. It offers a host of new features, including automatic tempo and beat detection, auto-cut scratching, and MIDI support, and the program's interface has been redesigned.

The changes are aimed at making the program easy enough for novices while letting professional DJs do more with their mixes. With the new version, users can match the playback speed of two songs for a perfect transition. In addition, the Auto-Cut feature allows users to scratch music in sync with a song's beat and rhythm. Djay 3 costs $50; a free 15-day trial is available from Algoriddim. The software runs on Mac OS X 10.4 or later.-JD