Cyberattacks on U.S. military jump sharply in 2009

Cyberattacks on the U.S. Department of Defense - many of them coming from China - have jumped sharply in 2009, a U.S. congressional committee reported Thursday. Citing data provided by the U.S. Strategic Command, the U.S.-China Economic and Security Review Commission said that there were 43,785 malicious cyber incidents targeting Defense systems in the first half of the year.

In all of 2008, there were 54,640 such incidents; if cyberattacks maintain this year's pace, they will jump 60 percent in 2009. The committee, which is looking into the security implications of the U.S.' trade relationship with China, released its annual report to Congress Thursday, concluding that a "large body of both circumstantial and forensic evidence strongly indicates Chinese state involvement in such activities." "The quantity of malicious computer activities against the United States increased in 2008 and is rising sharply in 2009," the report states. "Much of this activity appears to originate in China." Attacks on Defense Department systems have been rising steadily for years, and "the cost of such attacks is significant," the report notes.
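
A quick back-of-the-envelope check (ours, not the commission's) shows where that 60 percent projection comes from, assuming the first-half pace simply continues:

```python
# Sanity check of the 60% projection, assuming the first-half 2009 pace holds.
incidents_2008 = 54_640        # all of 2008
incidents_h1_2009 = 43_785     # first half of 2009

projected_2009 = incidents_h1_2009 * 2            # full-year estimate at the same pace
increase = projected_2009 / incidents_2008 - 1

print(f"Projected 2009 incidents: {projected_2009:,}")   # 87,570
print(f"Increase over 2008: {increase:.0%}")              # ~60%
```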

Citing data from the Joint Task Force-Global Network Operations, the report says that the military spent $100 million to fend off these attacks between September 2008 and March 2009. In 2000, for example, only 1,415 incidents were reported. A Defense Department spokesman did not have any immediate comment on the report's numbers Thursday. Part of the increase is due to the fact that the U.S. military is simply better at identifying cyberthreats than it used to be, said Chris Poulin, the chief security officer of Q1 Labs and formerly a manager of intelligence networks within the U.S. Air Force; the department's figures are "probably more accurate now" than they were nine years ago, he said. Security experts have long known that many computer attacks originate from Chinese IP (Internet Protocol) addresses, but due to the decentralized nature of the Internet, it is very difficult to tell when an attack is actually generated in China rather than simply using Chinese servers as a steppingstone.

Q1's Poulin says that his company's corporate clients in the U.S. are seeing attacks that come from China, North Korea and the Middle East. "We do definitely see patterns coming from specific nation states." He said that because China's government has taken steps to control Internet usage in the country, it could probably throttle attacks if it wanted to. "China's definitely initiating attacks," he said. "State-sponsored? Who knows. But they're certainly not state-choked."

What makes a carrier green?

ABI Research today released its rankings for "green" telecom carriers, rating major North American carriers based on how much they've invested in energy-saving IT technologies, internal telework initiatives and even recyclable mobile phones. ABI analyst Aditya Kaul spoke with Network World Senior Writer Brad Reed and discussed the metrics ABI used to create its ratings, which carriers fared best in the study and the benefits of green IT to consumers and businesses. What is it that makes a carrier green? We have listed a number of requirements for an operator to be green.

This looks at both corporate social responsibility (CSR) green initiatives and green network infrastructure initiatives. As for CSR, we look at green vehicle fleets, green IT, green handsets, recycling, etc. As for network infrastructure, we look at things like the use of alternative energy sources at cell sites; the use of innovative technologies to reduce energy consumption at cell sites; how involved carriers are with their value chain to drive environmental standards and the materials to be used; their openness to declaring and measuring their carbon footprint; the research and innovation budget dedicated to green networks or ecological initiatives; and so forth. We look at green from the telecom perspective across handsets, recycling, Wi-Fi, network infrastructure and basically the complete value chain. What sorts of technological investments does ABI rate as "green investments"? Particularly, what do you define as green network infrastructure? Green to us means technologies that are used to reduce the impact of climate change.

Green network infrastructure essentially means using technologies or methods to reduce energy consumption. Good examples include the use of innovative technologies in base stations, such as lower-power amplifiers and remote radio heads; the use of alternative fuels for off-grid and on-grid sites; the use of green core network equipment, such as super switches that aggregate switches into one unit to save energy; and initiatives to reduce cell site power consumption, including auxiliary site equipment. We also evaluate the amount of carbon reduction reported due to green mobile infrastructure initiatives. But more importantly, it turns out that green also means dollar savings for operators; a reduction in energy consumption at a cell site, for example, translates into a reduction in operating expenses. The study says both AT&T and Sprint have distinguished themselves as far as green investments go.

AT&T essentially wins on innovation and also in terms of green network infrastructure. Can you name some specific big-ticket investments they've made and how they're projected to save them energy? They have paid attention to how much energy is being consumed in their network infrastructure, have defined new metrics to measure carbon emissions, and have implemented programs such as the reduction of dual networks, which saved them 207,549 metric tons of CO2 emissions. AT&T also is doing work through its research facility at Bell Labs related to technologies that could save energy in the network. They are also actively involved in smart grid projects across the United States, especially with initiatives targeting last-mile connectivity and two-way communication. On the other hand, Sprint leads in areas like green handsets, with its Samsung Reclaim; recycling initiatives, through which it expects to recycle 90% of the phones it has sold by 2017; initiatives to drive its supplier value chain to adopt greener practices; and green IT initiatives, including retiring servers, improving the cooling efficiency of data centers, and recycling e-waste.

[Other initiatives include] recycling 50% of its operational waste; having 90% of suppliers comply with environmental standards; and securing 10% of its energy from renewable resources. Overall, the company expects to reduce carbon emissions by 15% by 2017. Sprint also has a clear strategy around educating the consumer, primarily driven through its green site. But while Sprint's green message is overarching and built around educating the customer, it fails to provide details on how those goals will be met, and the company falls behind in some critical areas such as green network infrastructure, where AT&T seems to have a better handle. Do the other carriers have any green IT investments of note? What do carriers such as Verizon or Rogers have to do to catch up?

The other carriers are way behind on most of the criteria measured and have only limited programs, such as handset recycling, that could be considered green. Although Verizon has some CSR initiatives like green IT, it loses out in terms of the breadth and depth of its green initiatives, which seem limited compared to competitors like Sprint and AT&T. The majority of carriers do not have any idea of, or any initiatives around, reducing the energy consumption of their mobile networks. Finally, what benefits, if any, are there for consumers and businesses when carriers invest in green technology? I think for the consumer, the biggest advantage will be from the handset perspective, in terms of the recycling initiatives the operator has, its goals around driving its supplier value chain to use environmentally friendly materials, etc. There is also the ethical standpoint of aligning with an operator that is doing its best to reduce its carbon footprint and has a strategy around doing that. Being green is also good for shareholder value.

From the matrix perspective, a lot of weight has been given to the network aspect of the operator, as the mobile network makes up around 80% of an operator's total energy consumption. Any savings the operator achieves on network costs can then be passed on to the consumer.
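
To make the ratings-matrix idea concrete, here is a minimal sketch of a weighted scorecard. The category weights and scores below are hypothetical (ABI has not published its exact formula); the point is simply that a network-heavy weighting, reflecting the roughly 80% share of energy use, dominates the overall result:

```python
# Hypothetical weighted "green" scorecard; weights and ratings are illustrative,
# not ABI Research's actual methodology.
WEIGHTS = {"network_infrastructure": 0.60, "csr_initiatives": 0.25, "handsets_recycling": 0.15}

def green_score(ratings: dict) -> float:
    """ratings: category -> score on a 0-10 scale."""
    return sum(WEIGHTS[cat] * ratings.get(cat, 0) for cat in WEIGHTS)

carrier_a = {"network_infrastructure": 8, "csr_initiatives": 6, "handsets_recycling": 5}
carrier_b = {"network_infrastructure": 4, "csr_initiatives": 9, "handsets_recycling": 9}
print(round(green_score(carrier_a), 2), round(green_score(carrier_b), 2))  # 7.05 6.0
```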

Wipro, other Indian outsourcers expand in the US

Wipro, India's third largest outsourcer, is expanding its development center in Atlanta from 350 to 1,000 staff, reflecting a growing trend for Indian outsourcers to expand and hire locally in the U.S. market. Wipro said that 80 percent of its current 350 employees there were hired locally, including recent graduates from reputable academic institutions in Atlanta, experienced professionals and retired army personnel. India's largest outsourcer, Tata Consultancy Services (TCS), said earlier this month that it was expanding its business alliance with The Dow Chemical Company, including setting up a services facility near the site of Dow's global headquarters in Midland, Michigan.

TCS also announced that it was expanding a software services delivery center in the Cincinnati suburb of Milford, Ohio. Infosys BPO, the business process outsourcing subsidiary of outsourcer Infosys Technologies, also said this month that it would acquire McCamish Systems, a BPO company in Atlanta focused on the insurance and financial services market. Indian outsourcing companies are expanding both in India and in the U.S., their key market, in anticipation of a pickup in business. Employing staff in the U.S. is expected to go over well with the local community and politicians because of resentment in the U.S. about companies moving jobs to India and other countries, analysts said. Political considerations are evidently a factor for Indian outsourcers to expand in the U.S., said Siddharth Pai, a partner at outsourcing consultancy firm Technology Partners International (TPI) in Houston.

U.S. Senators Bernie Sanders, an Independent from Vermont, and Chuck Grassley, an Iowa Republican, last week introduced legislation, called the Employ America Act, that would prohibit firms that lay off 50 or more workers from hiring guest workers. U.S. companies also do not want to be seen sending jobs abroad, Pai added. But there are also strong business considerations that require Indian companies to set up operations in the U.S., according to Pai. Indian outsourcers have to start looking like global players, he said. Japanese car makers, for example, manufacture all over the world because some customers would like to buy locally produced goods. Certain types of work, even in BPO, such as the development of technology platforms for services delivery and analytical work, require proximity to customers, he added.

Plex 0.8.3 brings extensive Snow Leopard compatibility

If you want to use your Mac as a media center, there's no better app for accomplishing that objective than Plex (Macworld rated it 4.5 out of 5 mice). This modern media center application features a gorgeous interface, automated and intelligent metadata-fetching capabilities, support for a vast variety of formats, the ability to play full 1080p high definition videos smoothly, an extensible plug-in architecture, and a host of more advanced, powerful features. As with many other applications, Apple's release of Snow Leopard left Plex playing catch-up, and though there has been an update or two over the past couple of months to improve compatibility with the latest big cat, Plex's relationship with Snow Leopard has remained strained at best. That's all over now, however, because Plex 0.8.3 is out and it brings a whole bunch of important bug fixes to the table, putting the software right back where it was before Snow Leopard came prowling. The most important fix involves the installation of a Candelair driver for the Apple Remote that makes it once again work smoothly with Plex (without also triggering Front Row and controlling iTunes in the background). And that isn't all.

The fine folks at Plex have finally put in a feature that addresses a longstanding complaint of Plex users around the world, myself included. It's called dynamic range compression, and it boosts the volume of downmixed 5.1 audio. To enable it, go to Preferences -> System -> Audio and change the Mixdown Volume Boost setting from Disabled to Normal. You'll also need to change the Digital Output Support setting to Force Digital and then disable the Dolby Digital (AC3) Capable Receiver and DTS Capable Receiver settings. Once all that is done, go back and play a movie with 5.1-channel audio.

And turn down the volume, please. Did I mention that Plex is a free application? Well, it is, so go ahead and give it a shot. The update weighs in at 104MB and is worth every iota of bandwidth you spend downloading it.

Secrets pref pane updated for Snow Leopard

Blacktree Software has released Secrets 1.0.6, a Snow Leopard-compatible version of its preference pane that exposes hidden features on your Mac. Secrets provides handy checkboxes to turn these features on and off, and doubles as a menu of secret settings. If you've ever read a Mac tip that starts, "Open a Terminal window and type 'defaults write...'", it's highly likely that you can save yourself that effort with this preference pane.

A "Top Secrets" entry shows a list of popular options, but many more options for various applications can be selected from the application sidebar. A few caveats before you go too nuts with the Secrets features: many of the features in Mac OS X that aren't official remain "secret" because they're not entirely debugged. Clicking on any of the listed features will show you a short description of what it does in the bottom of the window; click on the More Info button for a detailed description. You can expect to see some odd behaviors if you turn some of these on, so don't tick every checkbox at once; try out a change to see if you like it (and can live with any side effects) before you go on to something else. If this is happening with several of your third-party preference panes, you can set System Preferences to stay in 32-bit mode by selecting the System Preferences.app in the Finder, choosing Get Info, and ticking the "Open in 32-bit mode" checkbox.

The Secrets preference pane requires System Preferences to run in 32-bit mode, and will prompt you to relaunch if, per the Snow Leopard default, it's in 64-bit mode when you launch it. All of your Apple 64-bit preference panes will still work just fine. If this is happening with several of your third-party preference panes, you can set System Preferences to stay in 32-bit mode by selecting System Preferences.app in the Finder, choosing Get Info, and ticking the "Open in 32-bit mode" checkbox. Secrets requires Mac OS X 10.5 or later and is a free download. [via TUAW]

Data masking secures sensitive data in non-production environments

Last week's article covered the topic of protecting data in databases from the inside out; that is, watching every action involving data as it happens and promptly halting improper actions. This week's article takes a look at data masking, which is another way to protect sensitive data, especially as it is being copied and used in the development and testing of applications. Data masking is the process of de-identifying (masking) specific elements within data stores by applying one-way algorithms to the data. The process ensures that sensitive data is replaced with realistic but not real data; for example, scrambling the digits in a Social Security number while preserving the data format.
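
As a minimal sketch of that idea (assuming a generic approach, not any particular vendor's algorithm), the snippet below derives replacement digits from a one-way hash, so the output keeps the familiar XXX-XX-XXXX shape but cannot be mapped back to the original value:

```python
import hashlib

def mask_ssn(ssn: str, salt: str = "dev-env-salt") -> str:
    """Replace SSN digits with digits derived from a one-way hash, preserving
    the XXX-XX-XXXX format. Illustrative only, not a vendor's algorithm."""
    digest = hashlib.sha256((salt + ssn).encode()).hexdigest()
    digits = "".join(str(int(c, 16) % 10) for c in digest)  # hex -> decimal digits
    out, i = [], 0
    for ch in ssn:
        if ch.isdigit():
            out.append(digits[i])   # substitute a derived digit
            i += 1
        else:
            out.append(ch)          # keep separators such as '-'
    return "".join(out)

print(mask_ssn("123-45-6789"))  # same format, different, non-reversible digits
```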

The one-way nature of the algorithm means there is no need to maintain keys to restore the data, as you would with encryption or tokenization. Data masking is typically done while provisioning non-production environments, so that copies of data created to support test and development processes do not expose sensitive information. If you don't think this is important, consider what happened to Wal-Mart a few years ago. Wired.com reports that Wal-Mart was the victim of a serious security breach in 2005 and 2006 in which hackers targeted the development team in charge of the chain's point-of-sale system and siphoned source code and other sensitive data to a computer in Eastern Europe. Wal-Mart at the time produced some of its own software, and one team of programmers was tasked with coding the company's point-of-sale system for processing credit and debit card transactions. Many of the computers the hackers targeted belonged to company programmers.

This was the team the intruders targeted and successfully hacked. Wal-Mart's situation may not be unique: according to Gartner, more than 80% of companies use sensitive production data for non-production activities such as in-house development, outsourced or off-shored development, testing, quality assurance and pilot programs. The need for data masking is largely being driven by regulatory compliance requirements that mandate the protection of sensitive information and personally identifiable information (PII). For instance, the Data Protection Directive implemented in 1995 by the European Commission strictly regulates the processing of personal data within the European Union. U.S. regulations such as the Gramm-Leach-Bliley Act (GLBA) and the Health Insurance Portability and Accountability Act (HIPAA) also call for protection of sensitive financial and personal data.

Multinational corporations operating in Europe must observe this directive or face large fines if they are found in violation. Worldwide, the Payment Card Industry Data Security Standard (PCI DSS) requires strict security for cardholder data. In order to achieve full PCI compliance, organizations must protect data in every system that uses credit card data; that means companies must address their use of cardholder data for quality assurance, testing, application development and outsourced systems, and not just for production systems. In the Wal-Mart case discussed above, the retailer failed to meet the PCI standard for data security by not securing data in the development environment. A lack of processes and technology to protect data in non-production environments can leave a company open to data theft or exposure and regulatory non-compliance.

Many large organizations are concerned about their risk posture in the development environment, especially as development is outsourced or sent offshore. Development and test environments are rarely as secure as production, and there's no reason developers should have access to sensitive data. Data masking is an effective way to reduce that enterprise risk, and while encryption is a viable security measure for production data, it is too costly and carries too much overhead to be used in non-production environments. Many database vendors offer a data masking tool as part of their solution suites.

These tools, however, tend to work only on databases from a specific vendor. An alternative solution is to use a vendor-neutral masking tool. Dataguise is one of the leading vendors in the nascent market of data masking, and its solution has two complementary modules. dgdiscover is a discovery tool that searches your environment (including endpoints) to find sensitive data in structured and unstructured repositories; so, even if someone has copied data to a spreadsheet on his PC, dgdiscover can find it. This can be a valuable time-saving tool, as data tends to be copied to ever more places, especially as virtual environments grow and new application instances can be deployed on demand. dgdiscover also can be used to support audits and create awareness of data repositories.
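
To make the discovery idea concrete, here is a generic sketch of what such a scanner boils down to: walking file stores and flagging content that matches sensitive-data patterns. This is an illustration only, not dgdiscover itself, and the path at the bottom is hypothetical:

```python
import os
import re

# Generic illustration of sensitive-data discovery; not Dataguise's dgdiscover.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_tree(root: str) -> None:
    """Walk a directory tree and report files that appear to contain sensitive data."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as f:
                    text = f.read()
            except OSError:
                continue  # unreadable file; skip it
            hits = [label for label, rx in PATTERNS.items() if rx.search(text)]
            if hits:
                print(f"{path}: {', '.join(hits)}")

scan_tree("/path/to/exported/data")  # hypothetical location of copied data
```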

The second Dataguise module is dgmasker, a tool that automatically masks sensitive data using a one-way process that can't be reverse engineered. dgmasker works in heterogeneous environments and eliminates the common practice of having DBAs create their own masking techniques and algorithms. Instead, dgmasker obfuscates the real data so that it cannot be recovered by anyone - insider or outsider - who gains access to the masked data. The tool preserves relational integrity between tables and remote databases and generates data that complies with your business rules for application compatibility. In short, you have all the benefits of using your actual production data without using the real data.

Data masking is an effective tool in an overall data security program. You can employ it in parallel with other data security controls such as access controls, encryption, monitoring and review/auditing. Each of those technologies plays an important role in securing data in production environments; for non-production environments, however, data masking is becoming a best practice for securing sensitive data.

7 critical commercial spaceflight concerns the US must tackle

There has been a positive vibe around the commercial space industry in recent months, with NASA's potentially changing role, predictions of increased investment and the growth of new business opportunities such as space tourism. But such optimism needs to be tempered, because there are a host of issues the government, namely the Federal Aviation Administration, needs to address before commercial space operations can truly blast off, according to a report out today from watchdogs at the Government Accountability Office. According to the GAO report, the key problems that need to be resolved include: 1. Who's minding the store? The FAA faces challenges in ensuring that it has a sufficient number of staff with the necessary expertise to oversee the safety of commercial space launches and spaceport operations. The GAO said it raised concerns in the past that if the space tourism industry developed rapidly, the FAA's responsibility for licensing reusable launch vehicle missions would greatly expand.

Many companies are designing and developing space hardware that is being tested for the first time, requiring that the FAA have a sufficient level of expertise to provide oversight. The FAA's experience in this area is limited because its launch safety oversight has focused primarily on unmanned launches of satellites into orbit using expendable launch vehicles, the GAO stated. The FAA's Office of Commercial Space Transportation has hired 12 aerospace engineers, bringing its total staff to 71 full-time employees. In addition, the FAA has established field offices at Edwards Air Force Base and NASA's Johnson Space Center in anticipation of increased commercial space launches, the GAO report noted. 2. Who's in charge here? Numerous federal agencies have responsibility for space activities, including the FAA's oversight of commercial space launches, NASA's scientific space activities, the Department of Defense's national security space launches, the State Department's involvement in international trade issues, and the Department of Commerce's advocacy and promotion of the industry. According to the National Academy of Sciences, aligning the strategies of the various civil and national security space agencies would address many current issues arising from or exacerbated by the current uncoordinated, overlapping, and unilateral strategies.

The GAO stated that its research identified several gaps in federal policy for commercial space launches. For example, while the FAA has safety oversight responsibility for the launch and re-entry of commercial space vehicles, agency officials told the GAO that no federal entity has oversight of orbital operations, including the collision hazard posed while in orbit by satellites and debris such as spent rocket stages or defunct satellites. A national space launch strategy could identify and fill such gaps in federal policy concerning the commercial space launch industry, according to senior FAA and Commerce officials, the GAO stated. 3. Is it safe? The FAA will need to determine whether its current safety regulations are appropriate for all types of commercial space vehicles, operations, and launch sites, the GAO stated. If the industry begins to expand, as senior FAA officials predict, to 200 to 300 annual launches, a reassessment of the FAA's resources and areas of expertise would be appropriate. Moreover, as NASA-sponsored commercial space launches increase, the FAA's need for regulatory resources and expertise may change, the GAO stated.

The FAA is responsible for the protection of the uninvolved public, which could be affected by a failed mission. It has interpreted its limited authority as allowing it to regulate crew safety in certain circumstances and has been proactive in issuing a regulation concerning emergency training for crews and passengers. However, the FAA has not developed indicators that it would use to monitor the safety of the developing space tourism sector and determine when to step in and regulate human space flight, the GAO stated. 4. Airspace considerations: NextGen, the FAA's grand plan to transform the current radar-based air traffic management system into a more automated, aircraft-centered, satellite-based system, will need to accommodate spacecraft traveling to and from space through the national airspace system, the GAO stated. As the commercial space launch industry grows and space flight technology advances, the FAA expects that commercial spacecraft will frequently make that transition, and the agency will need tools to manage a mix of diverse aircraft and space vehicles in the national airspace system. The agency will also need to develop new policies, procedures, and standards for integrating space flight operations into NextGen. For example, it will have to define new upper limits to the national airspace system to include corridors for flights transitioning to space, and set new air traffic procedures for flights of various types of space vehicles.

The FAA has begun to consider such issues and has developed a concept of operations document, the GAO found. 5. Conflict of interests: The GAO said in 2006 that the FAA faced the potential challenge of overseeing the safety of commercial space launches while at the same time promoting the industry. While the GAO said it found no evidence that the FAA's promotional activities, such as sponsoring an annual industry conference and publishing industry studies, conflicted with its safety regulatory role, it noted that potential conflicts may arise as the space tourism sector develops. 6. Government-sponsored insurance? In an effort to back U.S. commercial space ventures, the U.S. government has indemnified launch operators, but the law that allows for indemnification expires in December 2009, the GAO stated. Industry players have called for the continuation of indemnification to support U.S. competitiveness, and the continuation of such federal involvement will assist industry growth, the GAO stated.

Indemnification secures another party against risk or damage. The U.S. government indemnifies launch operators by providing catastrophic loss protection covering third-party liability claims in excess of required launch insurance in the event of a commercial launch incident. Currently, launch operators are required to buy third-party liability insurance for up to $500 million, in addition to insurance for their vehicle and its operations, and the U.S. government provides up to $1.5 billion in indemnification. 7. When bad things happen: What will be the role of the National Transportation Safety Board (NTSB) in investigating any accidents that occur? According to the GAO, the NTSB does not have space transportation explicitly included in its statutory jurisdiction, although it does have agreements with the FAA and the Air Force under which it will lead investigations of commercial space launch accidents. A 2008 commissioned report on human space flight suggested that Congress may want to consider explicitly designating a lead agency for accident investigations involving space vehicles to avoid potential overlapping jurisdictions, the GAO stated.
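
As a rough illustration of the liability layering described in item 6 (a sketch based only on the coverage ceilings cited above; actual claim settlement is far more involved), the operator's required insurance pays first and government indemnification covers the excess up to its cap:

```python
# Rough sketch of the launch-liability layering described above; real claim
# handling is far more involved than this.
OPERATOR_INSURANCE_CAP = 500_000_000            # required third-party liability insurance
GOVERNMENT_INDEMNIFICATION_CAP = 1_500_000_000  # federal indemnification ceiling

def split_claim(claim: int):
    """Return (insurer pays, government pays, uncovered) for a third-party claim."""
    insurer_pays = min(claim, OPERATOR_INSURANCE_CAP)
    government_pays = min(max(claim - OPERATOR_INSURANCE_CAP, 0),
                          GOVERNMENT_INDEMNIFICATION_CAP)
    uncovered = claim - insurer_pays - government_pays
    return insurer_pays, government_pays, uncovered

print(split_claim(2_300_000_000))  # (500000000, 1500000000, 300000000)
```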

Challenges await head of new SAP user group

The Americas' SAP Users' Group announced its new CEO on Tuesday, nearly one year after parting ways with its previous chief. Interim CEO Bridgette Chambers will take leadership of ASUG, which represents about 70,000 individuals at 2,000 member companies. Chambers assumes the role previously held by Steve Strout, who was ousted by ASUG's board in November 2008 for undisclosed reasons.

As for Strout before her, a key issue for Chambers is SAP's controversial decision to move all customers to a fuller-featured but pricier Enterprise Support service. While some European user groups were especially vocal about SAP's move, ASUG officials adopted a more moderate tone in public remarks. Following months of debate, SAP and the SAP User Group Executive Network (SUGEN), an organization made up of representatives from SAP user groups around the world, agreed to develop a set of KPIs (key performance indicators) meant to prove the value of Enterprise Support. SAP has agreed to hold off on its incremental price increase schedule for Enterprise Support "until the targeted improvements measured by the SUGEN KPI Index are met." There will be an announcement regarding the KPIs later this year, said SAP spokesman Saswato Das.

Some customers are more accepting than others of SAP's Enterprise Support decision, given that the company had held maintenance rates steady for many years, according to Chambers. However, she added, "quite frankly, SAP can drop in every value-add they can, but at the end of the day the proof is in the KPIs. This adds value or it does not. If it does, both SAP and customers win. If it does not, they need to understand the customer base is not open to this. ... We will help our customers make sure they get an answer." Even as it lobbies for members' interests, ASUG has had an intimate relationship with SAP, going as far as co-locating its annual user conference with the vendor's Sapphire show. Despite these ties, ASUG has retained its independence and objectivity, Chambers said. "I believe that is the clear differentiator for ASUG," she said. "Yes, we have close relationships with SAP. Yes, there is sharing of expenses for events ... [But] I don't really think you've got another organization that possesses the level of objectivity we do." Not all ASUG members are convinced, according to one observer. "The underlying concern that many ASUG members have expressed to us in the past has been that board members' organizations may have special relationships with SAP that could be jeopardized if they were to privately or publicly confront SAP on issues," said Ray Wang, a partner with the analyst firm Altimeter Group. "It would help usher in an era of transparency if members understood what those relationships are." Chambers declined to address the issues raised by Wang, saying it is not her position to speak for ASUG's board members. "I will say that I am pleased and proud to work for a board that is so interested in all the issues that impact the SAP ecosystem," she said. "I have watched board members work tirelessly to ensure that the mission of ASUG is supported." Chambers has a number of organizational goals and challenges on her plate, including plans to refocus ASUG around "education, influence and networking," she said. To that end, she has been conducting a series of "town hall" meetings in recent weeks to gather feedback from ASUG members.

In addition, by the end of 2010, ASUG members should be able to better determine how much return they've received on their investment in a membership, Chambers said. "What I will be able to do is make it measurable. You'll be able to verify the value is approximately 'X.' Right now, the answer [to that question] is softer."

The scourge of complexity

While the definition of cloud computing is at best a bit fuzzy, the goal of cloud computing is extremely clear: to make a significant improvement in the cost-effective, elastic provisioning of IT services. As will be explained in this newsletter, complexity is the enemy of cloud computing. In a recent article, Geir Ramleth, the CIO of Bechtel, stated that he benchmarked his organization against some Internet-based companies.

According to that article, "Bechtel operates 230 applications, and it runs 3.5 versions per application. When you look at Salesforce.com, not only are they running one application, but they are running one version and they are only running it in one location," Ramleth says. That means Bechtel maintains approximately 800 application versions at any given time. If his organization wants to make a change to some component of the IT infrastructure that supports one of the 230 applications it operates, it has to devote additional time to quality assurance to test how the change impacts each version of the application. We don't see how Bechtel or any other IT organization will be able to fundamentally reduce cost and become more agile if it continues to offer such a highly complex set of services.
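
A quick back-of-the-envelope calculation (ours, not the article's) shows where that roughly 800 figure comes from:

```python
# Rough arithmetic behind the "approximately 800" figure.
applications = 230
versions_per_application = 3.5
total_versions = applications * versions_per_application
print(total_versions)  # 805.0, i.e. roughly 800 application versions to maintain
```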

In the example that Ramleth gave, his organization will incur significant extra cost in part because it has to allocate resources to support, on average, 3.5 versions of each application. Bechtel is not the only IT organization that supports a complex environment. Many IT organizations utilize multiple WAN providers, develop a lot of custom applications, perform extensive customization of third-party applications, and have multiple systems for functions such as enterprise resource planning or supply chain management. We believe that any IT organization that is serious about cloud computing has to get serious about simplifying the services that it provides. What do you think?

Do you work in a highly complex IT environment? Is any effort being made to simplify that environment? Write to us and let us know. Also, we are performing a survey to help identify the concrete steps that IT organizations are taking to implement cloud computing. If you have a few minutes to fill out the survey, it will help us to cut through the hype and understand what IT organizations are actually doing relative to cloud computing.

NASA says 200-yard long asteroid will miss Earth

NASA scientists have recalculated the path of a large asteroid known as Apophis and now say it has only a very slim chance of banging into Earth. The Apophis asteroid is approximately the size of two-and-a-half football fields. Initially, it was thought to have a 2.7% chance of impacting Earth in 2029, but additional observations of the asteroid ruled out any possibility of an impact that year. Updated computational techniques and newly available data indicate the probability of an Earth encounter on April 13, 2036, has dropped from one-in-45,000 to about four-in-a-million, NASA stated. The new data were documented by near-Earth object scientists Steve Chesley and Paul Chodas at NASA's Jet Propulsion Laboratory in Pasadena, Calif. They will present their updated findings at a meeting of the American Astronomical Society's Division for Planetary Sciences in Puerto Rico this week.
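
For a sense of scale, simple arithmetic on the odds NASA cites shows the revised estimate cuts the 2036 impact probability by a factor of more than five:

```python
# Comparing the old and new odds NASA cites for a 2036 Apophis impact.
old_probability = 1 / 45_000        # about 2.2e-05
new_probability = 4 / 1_000_000     # 4.0e-06
print(f"old: {old_probability:.2e}, new: {new_probability:.2e}")
print(f"reduction factor: {old_probability / new_probability:.1f}x")  # ~5.6x
```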

The recalculated trajectory came from scientists at the University of Hawaii's Institute for Astronomy in Manoa, using its 88-inch telescope located near the summit of Mauna Kea. The information provided a more accurate glimpse of Apophis' orbit well into the latter part of this century. Among the findings is another close encounter between the asteroid and Earth in 2068, with the chance of impact currently at approximately three-in-a-million. As with the earlier orbital estimates, where Earth impacts in 2029 and 2036 could not initially be ruled out due to the need for additional data, it is expected that the 2068 encounter will diminish in probability as more information about Apophis is acquired, NASA said. NASA detects and tracks asteroids and comets passing close to Earth through its Near-Earth Object Observations Program, or "Spaceguard." The program has been in the news lately, as a National Academy of Sciences report said that while the space agency is tasked with watching out for huge chunks of space rock that could smash into the Earth, it has been denied the money to actually do the job. The problem is that while Congress mandated four years ago that NASA detect and track 90% of space rocks known as near-Earth objects (NEOs) 140 meters in diameter or larger, it has not authorized any funds to build additional observatories, either in space or on the ground, to help NASA achieve its goals, according to a wide-ranging interim report on the topic released by the National Academy of Sciences this week.

NASA does carry out the "Spaceguard Survey" to find NEOs greater than 1 kilometer in diameter, and this program is currently budgeted at $4.1 million per year for FY 2006 through FY 2012. The report notes that NASA has managed to accomplish some of the killer-asteroid mandate with existing telescopes, but with over 6,000 known objects and countless others, the task is relentless. The report also notes that the United States is the only country that currently has an operating survey/detection program for discovering near-Earth objects; Canada and Germany are both building spacecraft that may contribute to the discovery of near-Earth objects, but neither mission will detect fainter or smaller objects than ground-based telescopes. The report goes on to state: Imminent impacts (such as those with very short warning times of hours or weeks) may require an improvement in current discovery capabilities. In the past, objects with short warning times have been discovered serendipitously as part of surveys having different objectives. Existing surveys are not designed for this purpose; they are designed to discover more-distant NEOs and to provide years of advance notice for possible impacts.

Search strategies for discovering imminent impacts need to be considered, and current surveys may need to be changed.

Compuware buys Gomez for $295 million

Compuware said Wednesday it has agreed to acquire Web application management vendor Gomez for US$295 million. The transaction is expected to close in November. Gomez's technology will work in concert with Compuware's portfolio of tools for managing the performance of on-premises applications, providing coverage "from the data center to the customer," the companies said in a statement. Such capabilities are crucial in today's IT environments, Compuware President Bob Paul said during a conference call Wednesday.

For example, a retail banking transaction may begin with customers using an iPhone to connect with an online banking Web site, and end up spanning multiple third-party services, back-end databases, ISPs and mobile carriers, Paul said. "The complexity is staggering." The acquisition will bolster Compuware's ability to compete with the likes of Hewlett-Packard, CA and BMC in application performance management. Those competitors offer only "narrow, keyhole views" into various areas, Paul claimed. Compuware will also gain fresh footholds in many of the world's largest Web properties: Gomez has about 2,500 customers, including Google, Facebook, Yahoo and Amazon.com, according to its Web site. The acquisition announcement follows steps Gomez had taken to prepare for an IPO. The privately held vendor has 272 employees and is based in Lexington, Massachusetts. Compuware is not planning any significant personnel changes, according to a statement.

Gomez's current product road map will also "essentially remain unchanged," and the Gomez brand will be retained, although plans to integrate the vendors' offerings are afoot, Compuware said.

Microsoft shows off Bing tool for measuring ad effectiveness

Microsoft on Monday demonstrated a new tool for its Bing search engine that will allow advertisers to measure the effectiveness of their ads with online users. Speaking at the IAB MIXX Conference and Expo 2009 in New York on Monday, Yusuf Mehdi, senior vice president of Microsoft's Online Audience Business group, showed off what he called a "user-level targeting" tool that allows Microsoft to see which search-based ads that appear in the Bing search engine are getting the most traffic and from where. "What we're doing with Bing for vigorous measurement is we're matching the exact ad online with the exact user," he said. The tool Microsoft created shows where the interest in a marketing or advertising campaign is specifically coming from, he said. Mehdi pointed out that statistics show that 39 percent of Web users do 65 percent of the online searches, so it would be beneficial for advertisers to see which ads are reaching those "heavy users," versus which ads are favored by "light users." This measuring ability for Bing was demonstrated as part of Mehdi's presentation, in which he discussed how Microsoft is applying lessons it's learned from studying advertising campaigns and creating technology to reflect that learning.

One of those lessons was what he characterized as "relentless measurement and optimization" to find out what ads are most effective so they can be better targeted to their proper audience. "One of the big things is trying to build a loyal fan base for the product," he said. "You can't just go out and put your message everywhere. You have to pick and focus." Microsoft revamped and rebranded its Live Search engine as "Bing" in June, and making it more effective for search advertising is something the company continues to work on, Mehdi said. This kind of ability to measure what kinds of online advertising are working with users is becoming essential as more and more business is being done on the Web. It was unclear from Mehdi's presentation whether this technology is available for advertisers using Bing today or whether it's just something Microsoft is using internally. A representative from Microsoft's public relations firm, Waggener Edstrom, declined to answer follow-up questions about the technology or his presentation.

In fact, Microsoft competitor Adobe Systems - an executive from which spoke before Mehdi on Monday - last week said it was purchasing Web analytics company Omniture to build measuring technology directly into Adobe's tools for creating online media.

Bank sues Google for ID of Gmail user

A bank that inadvertently sent confidential account information on 1,325 of its customers to the wrong Gmail address is suing Google for the identity of the Gmail account holder. The case, filed in the U.S. District Court for the Northern District of California, involves Rocky Mountain Bank of Wyoming. According to court documents, the bank in August received a request from one of its customers asking for certain loan statements to be sent to a third party.

An employee of the bank, responding to the request, sent the documents to the wrong Gmail address. In addition to the requested loan information, the employee also inadvertently attached a file containing names, addresses, tax identification numbers and other details on 1,325 account holders. When it discovered the error, the bank immediately sent an e-mail to the Gmail address asking the recipient to delete the previous e-mail and the attachment. The bank also asked the recipient to contact the bank to discuss what actions had been taken to comply with the bank's request.

When it received no reply, the bank sent an e-mail to Google asking whether the Gmail account was active or dormant and also what it could do to prevent unauthorized disclosure of the inadvertently leaked information. When Google refused to provide any information on the account without a formal subpoena or court order, the bank filed a complaint asking the court to force Google to identify the account holder. Rocky Mountain Bank also requested that its complaint and all of the pleadings and filings in the case be sealed. The bank argued that if the complaint and motion papers were not sealed, all of its customers would learn of the inadvertent disclosure; it said it hoped to prevent unnecessary panic and a "surge of inquiry from its customers." U.S. District Court Judge Ronald Whyte dismissed that request, saying there was no need for the proceedings to be sealed. "An attempt by a bank to shield information about an unauthorized disclosure of confidential customer information until it can determine whether or not that information has been further disclosed and/or misused does not constitute a compelling reason," Whyte wrote last week. This is the third time in recent weeks that Google has faced a similar issue.

Earlier this month, the Associated Press reported that a resort developer in Miami had obtained a court order requiring Google to disclose the identities of anonymous contributors to an online newspaper in the Turks and Caicos Islands. The man alleged that the contributors to the paper had unfairly linked him to government corruption. In that case, Google indicated that it would disclose the data only after first informing the paper about the request and giving it a chance to appeal for the court order to be quashed. In the other incident, a court in New York compelled Google to disclose the identity of a blogger who had made disparaging comments about a Vogue model in her blog "Skanks in NYC."

Start-up unveils storage platform for large-scale Web applications

A storage company emerged from stealth mode this week with software designed to efficiently manage the file-serving needs of Internet applications such as social networks, online ad serving and software-as-a-service. MaxiScale announced the Flex Software Platform, which is installed on commodity gear, such as a bank of Apache Web servers. The goal is to improve performance and reduce cost, space and power requirements for Web companies that have to deal with large numbers of small files. Retrieving a small file with the MaxiScale system requires just one I/O operation, a feature that eliminates bottlenecks caused by systems that require multiple I/O operations for each small-file retrieval, says IDC storage analyst Noemi Greyzdorf. "They built a very interesting file system that handles small files – files that are one megabyte or smaller – incredibly efficiently," Greyzdorf says. Configurations start with as few as four nodes but can scale up to 50,000 servers, the company says. "We think people deploying Web applications have been paying too much money and we're out to change that," says Gary Orenstein, vice president of marketing for MaxiScale. Instead of using expensive storage boxes with interconnects like InfiniBand or Fibre Channel, MaxiScale recommends using Flex with 2TB SATA drives and says the Flex system relies on IP and Ethernet connections. "We're using standards-based, commodity hardware for everything," Orenstein says.
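
To illustrate the general idea behind a single-I/O small-file read (a conceptual sketch only; MaxiScale has not published its on-disk format, so none of this is its actual design), the trick is to pack a file's metadata and contents side by side so that one read at a known offset returns both, instead of an inode lookup followed by a separate data read:

```python
import json
import struct

# Conceptual illustration of a single-read small-file store: each record is
# [4-byte metadata length][metadata JSON][file bytes], and an in-memory index
# maps a filename to (offset, record length), so retrieval is one seek + one read.
# This is NOT MaxiScale's actual design, just the general packing idea.
index = {}

def append_file(log, name: str, data: bytes) -> None:
    meta = json.dumps({"name": name, "size": len(data)}).encode()
    record = struct.pack(">I", len(meta)) + meta + data
    offset = log.tell()
    log.write(record)
    index[name] = (offset, len(record))

def read_file(log, name: str) -> bytes:
    offset, length = index[name]
    log.seek(offset)
    record = log.read(length)                       # single contiguous read
    meta_len = struct.unpack(">I", record[:4])[0]
    return record[4 + meta_len:]                    # metadata and data arrive together

with open("smallfiles.log", "w+b") as log:
    append_file(log, "avatar.jpg", b"fake-image-bytes")
    print(read_file(log, "avatar.jpg"))
```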

MaxiScale's first publicly named customer is AdMob, a mobile advertising marketplace that has served more than 110 billion ad impressions in the last three years. Flex uses a patent-pending Peer Set architecture that replicates file data and metadata across SATA drives, allowing for load balancing and resiliency to multiple hardware failures. Based in Sunnyvale, Calif., and founded in 2007, MaxiScale has $17 million in venture financing from investors NEA, El Dorado Ventures and Silicon Valley Bank. The company was co-founded by CEO Gianluca Rattazzi, who previously founded Meridian Data, Parallan, P-Com and BlueArc, and CTO Francesco Lacapra, who previously held executive roles at Olivetti, Quantum and BlueArc. Flex software is available now, and pricing starts at $6,000 for four nodes supporting up to 32TB of storage.

DHS faces challenge in hiring 1,000 security experts

The U.S. Department of Homeland Security's effort to hire some 1,000 new cybersecurity experts could hit a wall because such skills are increasingly hard to find, according to security experts. DHS Secretary Janet Napolitano announced last week that the U.S. Offices of Personnel Management and Management and Budget have agreed to allow the agency to hire up to 1,000 security experts over the next three years to ramp up its cybersecurity efforts. Alan Paller, research director with the SANS Institute, a Bethesda, Md.-based computer training and certification organization, said DHS has a critical need for strong technical skills among its security professionals, who handle tasks like intrusion analysis, malware reverse engineering, security auditing, secure code analysis, penetration testing and forensics. "That's what DHS needs and is trying to hire," he said, adding that the agency faces strong competition for such skills from other government agencies, like the National Security Agency, along with private-sector companies. The problem for all of these organizations, Paller said, is that there aren't enough security professionals to meet the need. "DHS will be forced to hire weaker people and the cost of the very strong people will skyrocket," Paller predicted.

Clint Kreitner, president and CEO at the Center for Internet Security (CIS), added that DHS must also create "leadership stability and consistency" before it can hope to attract the right kind of talent. The agency needs to address issues that have contributed to a relatively high turnover rate among mid-level to senior cybersecurity professionals, Kreitner said. The DHS' National Cyber Security Division, which oversees day-to-day cybersecurity efforts, has seen a "tremendous amount of turnover," Kreitner said. "It has been a revolving door." Many of those security professionals have gone on to the private sector. "My guess is they don't feel like they are contributing as much as they would like to. If people felt they were being effective they would stay longer," he said. President Barack Obama's continuing delay in appointing a White House cybersecurity coordinator has also been "a source of discouragement to many who are wondering whether the reality will match the rhetoric" on cybersecurity matters, Kreitner said.

Obama announced the creation of the position in May. John Pescatore, an analyst with Gartner Inc. in Stamford, Conn., questioned whether the DHS, which employs some 200,000 workers overall, really needs 1,000 new security experts to meet its requirements. "For a typical private industry company of that size, you might see at most 200 information security staff - 1,000 would be unheard of," he said. Pescatore conceded that an agency with the DHS mission has more information security needs than the typical company, but he called the plan to hire 1,000 new workers "incredibly silly and hard to do. They don't need that many. [Even] if they did, they would be much better off training existing staff to become skilled in information security," he said. The NSA's increasing security leadership has led to increasing calls for the DHS to oversee the defense of government and commercial interests in cyberspace.

The announcement about the DHS hiring plans comes amid continuing questions about cybersecurity leadership. Critics have said that the NSA and the Department of Defense continue to exert more leadership on cybersecurity issues.

iTunes competitor doubleTwist gains Amazon MP3 store

We mentioned doubleTwist back in February when the Mac version debuted, but the latest update to the media management software has added a major new feature that positions it directly as an iTunes competitor: you can now buy music directly in doubleTwist from the Amazon MP3 store. Music is all that's on offer here, and as on Amazon's Web site, you can play 30-second samples of tracks and purchase individual songs or albums. The store bears a passing resemblance to the iTunes Store, albeit trimmed down in both appearance and content: it lacks the glut of information and media that Apple's offering has accumulated throughout the years. Upon buying media, doubleTwist will automatically download the files and then allow you to sync them to any device that the software supports.

As always, doubleTwist's major strength is acting as a conduit for transferring music, photos, and videos onto a veritable cornucopia of multimedia devices that aren't supported by iTunes, including BlackBerry, Android, and Windows Mobile devices. It also supports digital cameras and portable game consoles like the PSP. However, while the Windows version supports syncing with the iPod and the iPhone, this capability is still not available on the Mac, though doubleTwist says it's scheduled for a future update. So far, the store is only available in the Mac version of doubleTwist, but the Windows version is scheduled for release in the next week or so. While doubleTwist isn't nearly as mature as iTunes (basic amenities like shuffle and a track-time display are missing), its underlying framework seems sound. I can't help but think that Palm would have been better off hitching its wagon to a legitimate program like this for the Pre's syncing needs, instead of repeatedly attempting to make an end run around Apple by hacking its way into iTunes.

The program is a free download, but requires Mac OS X 10.5 or later.

New ICANN agreement runs into criticism

A new agreement between the Internet Corporation for Assigned Names and Numbers (ICANN) and the U.S. Department of Commerce that creates international oversight of the nonprofit operator of the Internet's domain name system may not provide enough accountability, some critics said. ICANN and the U.S. Department of Commerce (DOC) announced the new agreement Wednesday, the day an 11-year series of agreements expired. The agreement seemed to enjoy widespread support, but some critics questioned how independent the new review teams overseeing ICANN would be and whether the new agreement represented average Internet users.

Under those agreements, the U.S. government provided primary oversight of ICANN. ICANN has a long history of disagreement among stakeholder groups and calls by other nations for the U.S. to give up its oversight role. One of the main changes in the new agreement, called an Affirmation of Commitments, is the creation of new review panels, which would check ICANN's compliance with the agreement every three years. Volunteers would serve on those review teams, as would independent experts and representatives of the ICANN board of directors and the DOC.

The problem is that ICANN's chairman or CEO and the chairman of ICANN's Governmental Advisory Committee (GAC), selected by all the nations involved with ICANN, would have the final say on the makeup of those review teams, said Brenden Kuerbis, operations director of the Internet Governance Project, a group of academics focusing on Internet governance issues. "The review panels are not external to ICANN," Kuerbis said Thursday at an ICANN forum hosted by the Congressional Internet Caucus. "They're selected by the very people responsible for what ICANN does. They're likely to produce the politics that already exist within ICANN." ICANN's major problem isn't a lack of oversight, it's a lack of clearly defined rules for the organization and standards to measure performance, Kuerbis added. "If these rules don't exist - and they still don't - the review panels ... can just become another layer of politics and second-guessing, superimposed on what is already a messy and pretty diffuse process," he said.

However, ICANN Vice President Paul Levins disagreed that the review teams will be made up of ICANN allies. There will be public comment on membership of the review teams, and ICANN's board and CEO don't control GAC, he said. "It's going to be extremely hard [for ICANN] to game the process," he said.

Another criticism of the new agreement is that it was negotiated between ICANN and the DOC in secret, even as the agreement calls on ICANN to be accountable and transparent to the public and to use a bottom-up decision-making process. "Whatever deliberation occurred prior to the approval of this 'affirmation of commitments' was entirely secret - except for those favorite friends ICANN chose to invite into the smoke-filled room, or to whom the deliberations or decisions were leaked," Edward Hasbrouck, a travel blogger and ICANN critic, wrote on ICANNwatch.org, an ICANN watchdog site. "In fact, the completely secret, nontransparent and unaccountable way in which these 'commitments' were adopted is clear and compelling evidence of ICANN's continuing 'lack' of any actual commitment to these principles, or indeed to any transparency or accountability; its continuing commitment to lie - as loudly and as prominently as it can - about its lack of accountability and transparency; and the continuing need for 'real' transparency and accountability," the blog post continued.

But other ICANN watchers offered support for the new agreement. It's clear that ICANN received input from outside groups, and the agreement addressed major concerns about U.S. control over ICANN, said Steve DelBianco, executive director of NetChoice, an e-commerce trade group and frequent ICANN critic. The new agreement gives the U.S. government a continued role in ICANN oversight, but it spreads out the oversight to other governments and the private sector, he said. "ICANN's independence day will be known as Sept. 30, 2009," DelBianco said. "[The agreement] is very clever in the way it balances some of those forces that were speaking out." GAC, which has complained of not having enough oversight of ICANN, will now have more control, he said. "The way we relieved the pressure [on ICANN] was to give governments more say," he said.

Other supporters of the new agreement included registrar Go Daddy, the Software and Information Industry Association, and U.S. Representative Henry Waxman, a California Democrat and chairman of the House Energy and Commerce Committee. "This agreement is a perfect example of how a public-private partnership can work to the advantage of all stakeholders," Waxman said in a statement. "It will help insure that the Internet remains stable and secure for the people around the world who use it for work, study, entertainment, or to stay in touch with family and friends."

Apple missed security boat with Snow Leopard, says researcher

Apple missed a golden opportunity to lock down Snow Leopard when it again failed to fully implement security technology that Microsoft perfected nearly three years ago in Windows Vista, a noted Mac researcher said today. Dubbed ASLR, for address space layout randomization, the technology randomly assigns data to memory to make it tougher for attackers to determine the location of critical operating system functions, and thus make it harder for them to craft reliable exploits.

"Apple didn't change anything," said Charlie Miller, of Baltimore-based Independent Security Evaluators, the co-author of The Mac Hacker's Handbook, and winner of two consecutive "Pwn2own" hacker contests. "It's the exact same ASLR as in Leopard, which means it's not very good." Two years ago, Miller and other researchers criticized Apple for releasing Mac OS X 10.5, aka Leopard, with half-baked ASLR that failed to randomize important components of the OS, including the heap, the stack and the dynamic linker, the part of Leopard that links multiple shared libraries for an executable. Miller was disappointed that Apple didn't improve ASLR from Leopard to Snow Leopard. "I hoped Snow Leopard would do full ASLR, but it doesn't," said Miller. "I don't understand why they didn't. But Apple missed an opportunity with Snow Leopard."

Even so, Miller said, Apple made several moves that did improve Mac OS X 10.6's security. Two that stand out, he said, were its revamp of QuickTime and additions to DEP (data execution prevention), another security feature used in Windows Vista. "Apple rewrote a bunch of QuickTime," said Miller, "which was really smart, since it's been the source of lots of bugs in the past." That's not surprising, since QuickTime supports scores of file formats, historically its weak link.

Last week, in fact, Apple patched four critical QuickTime vulnerabilities in the program's parsing of various file formats. "They've shaken out hundreds of bugs in QuickTime over the years, but it was still really smart of them to rewrite it," said Miller. How Apple's rewrite of QuickTime for Snow Leopard plays out, of course, is uncertain, but Miller was optimistic: an exploit of a vulnerability in Leopard's QuickTime that he had been saving doesn't work in the version included with Snow Leopard, he acknowledged. If it was up to him, though, Miller would do even more. "I'd reduce the number of file formats from 200 or so to 50, and reduce the attack surface. I don't think anyone would miss them." Snow Leopard's other major security improvement was in DEP, which Miller said has been significantly enhanced. DEP is designed to stop some kinds of exploits - buffer overflow attacks, primarily - by blocking code from executing in memory that's supposed to contain only data.
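
The effect Miller describes is easy to observe. The following is a minimal, illustrative sketch (not Apple's or Microsoft's implementation): on a Unix-like system where the loader randomizes library placement, the address printed below should change from one run to the next; on a system without that randomization it stays the same.

```python
# Illustrative only: observe one visible effect of ASLR by checking whether
# the C library's load address changes across runs. Assumes CPython on a
# Unix-like OS where ctypes can resolve symbols from the loaded C library.
import ctypes

libc = ctypes.CDLL(None)  # handle to symbols already loaded into this process
printf_addr = ctypes.cast(libc.printf, ctypes.c_void_p).value
print(f"printf currently mapped at {printf_addr:#x}")
# Run the script several times: differing addresses suggest library load
# addresses are being randomized; identical addresses suggest they are not.
```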

Microsoft introduced DEP in Windows XP Service Pack 2 (SP2), and expanded it for Vista and the upcoming Windows 7. Put ASLR and DEP in an operating system, Miller argued, and it's much more difficult for hackers to create working attack code. "If you don't have either, or just one of the two [ASLR or DEP], you can still exploit bugs, but with both, it's much, much harder." Because Snow Leopard lacks fully functional ASLR, Macs are still easier to compromise than Windows Vista systems, Miller said. "Snow Leopard's more secure than Leopard, but it's not as secure as Vista or Windows 7," he said. "When Apple has both [in place], that's when I'll stop complaining about Apple's security."

In the end, though, hacker disinterest in Mac OS X has more to do with numbers, as in market share, than with what protective measures Apple adds to the OS. "It's harder to write exploits for Windows than the Mac," Miller said, "but all you see are Windows exploits. That's because if [the hacker] can hit 90% of the machines out there, that's all he's gonna do. It's not worth him nearly doubling his work just to get that last 10%." Mac users have long relied on that "security-through-obscurity" model to evade attack, and it's still working. "I still think you're pretty safe [on a Mac]," Miller said. "I wouldn't recommend antivirus on the Mac."

But the missed opportunity continues to bother him. "ASLR and DEP are very important," Miller said. "I just don't understand why they didn't do ASLR right," especially, he added, since Apple touted Snow Leopard as a performance and reliability update to Leopard. "If someone else is running your machine, it's more unreliable than if you're running it," Miller concluded.

iStockphoto guarantees its collection

Starting today, iStockphoto, the micropayment royalty-free image, video, and audio provider, will legally guarantee its entire collection against copyright, moral right, trademark, intellectual property, and rights of privacy disputes for up to $10,000. The new iStock Legal Guarantee, delivered at no cost to customers, covers the company's entire 5 million-plus collection. Additional coverage under an Extended Legal Guarantee totaling $250,000 is available for the purchase of 100 iStock credits. "Our first line of defense has always been - and continues to be - our rigorous inspection process," said Kelly Thompson, chief operating officer of iStockphoto. "The Legal Guarantee is simply an added layer of protection for our customers, many of whom are using microstock more than ever before." Although common for traditional stock houses, such legal guarantees have not been standard in microstock because of the low prices. Recently, however, Vivozoom, another microstock company, took a similar action to guarantee its collection. iStock says that files purchased and used in accordance with its license will not breach any trademark, copyright, or other intellectual property rights or rights of privacy.

And, if a customer does get a claim, iStock will cover the customer's legal costs and direct damages up to a combined total of $10,000. iStock customers can increase their coverage for legal fees and direct damages up to a combined total of $250,000 by purchasing the Extended Legal Guarantee with iStock credits (the 100 credits cost between $95 and $138). iStock expects that this program will be popular with a very small percentage of sophisticated media buyers with very specific needs, and considers it to be a value-added service to customers rather than a major source of revenue.

You've got questions, Aardvark Mobile has answers

Aardvark has taken a different tack with search. The online service figures it's sometimes more productive to ask a question of an actual person (usually someone from within your social network) rather than brave the vagaries of a search engine and its sometimes irrelevant answers. And now the people behind Aardvark are bringing that same approach to the iPhone and iPod touch. Aardvark Mobile actually arrived in the App Store nearly a week ago.

But developer Vark.com waited until Tuesday to take the wraps off the mobile version of its social question-and-answer service. Aardvark Mobile tackles the same problem as the Aardvark Web site: dealing with subjective searches where two people might type in the same keywords but be searching for two completely different things. "Search engines by design struggle with these types of queries," Aardvark CEO Max Ventilla said. What Aardvark does is tap into your social networks and contacts on Facebook, Twitter, Gmail, and elsewhere to track down answers to questions that might otherwise flummox a search engine - things like "Where's a good place to eat in this neighborhood?" or "Where should I stay when I visit London?" With Aardvark's Web service, you'd send a message through your IM client to Aardvark; the service then figures out who in your network (and in their extended network) might be able to answer the question and asks them on your behalf. Ventilla says that 90 percent of the questions asked via Aardvark get answered, and the majority of questions are answered in less than five minutes.

The iPhone version of Aardvark works much the same way. Instead of an IM, you type a message directly into the app, tag it with the appropriate categories, and send it off to Aardvark. The service pings people for an answer, and sends you a push notification when there's a reply. In previewing the app, I asked a question about affordable hotels in Central London; two responses came back within about three minutes from other Aardvark users. If you shake your mobile device when you're on the Answer tab, Aardvark Mobile looks up any unanswered questions that you may be able to provide a response for (while also producing a very alarming aardvark-like noise). "We think Aardvark is particularly well-suited to mobile, and especially the iPhone given how rich that platform is to develop for," Ventilla said.

In addition to push notifications, Aardvark Mobile also taps into the iPhone's built-in location features to automatically detect your location, a feature that can help when you're asking about local hotspots. You don't have to already be using Aardvark's online service to take advantage of the mobile app. The free Aardvark Mobile app lets you set up a profile on your iPhone or iPod touch; Facebook Connect integration helps you instantly build up a network of friends who are also using the service. Aardvark Mobile requires iPhone OS 3.0.

DOJ expands review of planned Microsoft-Yahoo agreement

The U.S. Department of Justice has asked Microsoft Corp. and Yahoo Inc. to hand over more information regarding their proposed search partnership. A Microsoft spokesman confirmed in an e-mail to Computerworld today that the DOJ requested additional information, but added that it came as no surprise. "As expected, we received an additional request for information about the agreement earlier this week," wrote the spokesman, Jack Evans. "When the deal was announced, we said we anticipated a close review of the agreement given its scope, and we continue to be hopeful that it will close early next year." Evans declined to disclose exactly what information the DOJ is looking for. Nina Blackwell, a spokeswoman for Yahoo, said both companies are cooperating with federal regulators. "[We] firmly believe that the information [we] will be providing will confirm that this deal is not only good for both companies, but it is also good for advertisers, good for publishers, and good for consumers," she added.

Microsoft and Yahoo announced late in July that they had finalized negotiations on a deal that will have Microsoft's Bing search engine powering Yahoo's sites, while Yahoo sells premium search advertising services for both companies. The partnership, which was a year-and-a-half in the making, is aimed at enabling the companies to take on search behemoth Google as a united force. Microsoft officials contend that the deal with Yahoo will improve competition in the search market.

Matthew Cantor, a partner at Constantine Cannon LLP in New York and an experienced antitrust litigator, disagrees. He argues that since Yahoo will cease being a competitor in the search market, the DOJ is likely to say the Microsoft/Yahoo partnership is anticompetitive. Cantor said last month that when Yahoo's own search tool disappears, only two major search engines will remain - Google and Microsoft's Bing. In an interview today, Cantor applauded the DOJ's request for more information. "Most deals clear without a request for additional information. This is not run-of-the-mill," said Cantor. "The government believes there are potential antitrust concerns raised here. They would only request additional information if there was some kind of presumption that the deal will cause antitrust effects."

Cantor added that he thinks it could take months for Microsoft and Yahoo to pull this new information together, perhaps until the end of this year. Nonetheless, Blackwell told Computerworld that Yahoo is still hopeful the deal will close early next year.

SAP offers LinkedIn recruitment tools to channel partners

SAP has inked a deal with LinkedIn that will provide the software vendor's channel partners with special tools and services for the popular business social-networking and careers site.

The move is the first such agreement LinkedIn has formed with a software vendor, according to a statement. It is also the first instance of collaboration between the companies following an investment SAP's venture capital arm made in LinkedIn last year.

The offer is available globally and is aimed at channel partners with up to 1,000 employees. It includes a special tool that helps partners find, track and contact appropriate candidates, as well as access to a job posting service on the site. SAP's announcement indicates that partners will get a discount, but pricing information wasn't immediately available Thursday.

Some 140,000 SAP consultants use LinkedIn, according to SAP.

But it isn't the only option for finding talent. Career sites such as Dice.com attract their share of SAP professionals as well, and many systems integrators and staffing companies maintain their own candidate databases.

In addition, a lot can be done with LinkedIn using only its complimentary tools, according to Jon Reed, an independent analyst who tracks the SAP skills market. "Why pay LinkedIn to search things when you can do so much without paying LinkedIn anything?"

That said, the true SAP "all-stars" in the marketplace may not be actively looking for work, Reed noted. "SAP is no different than any other industry or area. Top performers rarely float résumés on job boards," he said. LinkedIn's social-networking components provide "a pretty powerful platform" for reaching out to such individuals, Reed said.

Meanwhile, the dismal economy is prompting companies to spend less money on recruitment efforts. But that doesn't mean companies aren't still looking for SAP talent, given the constantly shifting areas of demand for new skills as the vendor makes acquisitions and releases new products, according to Reed.

"Even with 50 people on your bench, you may not have what you need right now," he said.

Apogee introduces GiO audio interface and foot controller

Apogee has introduced GiO, an audio interface and foot controller for the Mac, designed to work with Apple's new Logic Studio and GarageBand '09.

GiO gives guitar players hands-free control over recording and many new features in Logic Studio, including Amp Designer, Pedalboard, MainStage 2, Playback, and Loopback. GiO also works with GarageBand '09, making it easy for GarageBand users to connect their guitar to their Mac and control recording functions and stompbox effects with their feet.

"Our Mac-only focus has allowed us to offer highly refined, innovative products, and give our customers an incredible experience," said Betty Bennett, CEO of Apogee. "With GiO, users of the new Logic Studio and GarageBand '09 have amps, stompboxes, recording control and legendary Apogee sound right at their feet, whether they're in the studio, at home, or on stage."

GiO's instrument input is specially designed for guitar-its 1/4-inch input features the company's instrument preamp and converters. These let you hear and record your guitar's true tone directly to Apple's GarageBand, Logic (both Studio and Express) and MainStage. Previous and Next preset select buttons let users audition and toggle between guitar amps and effects presets in the audio programs.

GiO is optimized for all pickup configurations from the single coil to sophisticated active electronics setups. GiO's five transport control buttons let guitar players record, play, stop, and quickly navigate through a project without taking their hands off the guitar. The same buttons can be assigned to functions selected by the user with MainStage 2, Logic Pro's live performance app. Guitar players can use the five stompbox buttons to individually control their favorite Pedalboard effects in Logic Studio while GiO's color indicators automatically adjust to match the colors displayed onscreen. GiO's Expression control pedal input allows access to traditional effects like wah-wah, and volume.

Connect your headphones, powered monitors, or instrument amplifier to Apogee converters via GiO's 1/4-inch stereo output to listen to your guitar and mix your composition.

GiO is fully powered by a USB connection to the Mac and built into an aluminum case. It requires a Power Mac G5 or Intel Mac running OS X 10.5.7 or later.

GiO will be available in September for $395.

Cloud interoperability remains wispy, but progress being made

Cloud computing is supposed to make IT more flexible, efficient and easier to manage. But the cloud model threatens to introduce a whole new layer of complexity, unless vendors and industry groups promote interoperability standards that let cloud networks work together.

Vendor competition is a potential impediment, but most major cloud vendors are at least talking about interoperability, including the ability to move workloads from one cloud to another.

"So far, it's closer to lip service, but there are a couple of efforts moving in this direction," says Forrester Research analyst James Staten.

Staten believes the most impressive project is one spearheaded by Distributed Management Task Force (DMTF), which has signed up vendors such as AMD, Cisco, Citrix, EMC, HP, IBM, Intel, Microsoft, Novell, Red Hat, Savvis, Sun Microsystems  and VMware for an effort called the Open Cloud Standards Incubator.

The group will let individual vendors demonstrate interoperability between two clouds and document methodologies to ensure that interoperability, according to Staten. The group thus tackles interoperability on a case-by-case basis, but the hope according to Staten is that this process will spur the development of industry-wide standards over time.

Cloud interoperability can mean many things, and users and vendors may not agree on which types of interoperability are most important. But some commonly discussed goals include the following:

* Moving virtual machines and workloads from one cloud compute service to another.

* Single sign-on for users who access multiple cloud services.

* Ability to deploy and provision resources from multiple cloud services with a single management tool.

* Letting one application span multiple cloud services (such as a storage service from one cloud provider and compute capacity from another).

* Allowing data exchange between clouds.

* Letting a private cloud application seamlessly obtain resources from a public cloud when excess capacity is needed.

In more general terms, enterprises want to avoid using a plethora of cloud services with different interfaces, and don't want to be locked in to a particular cloud by technologies that prevent the movement of workloads from one to another.
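
One way to picture the "single management tool" and anti-lock-in goals described above is a thin, provider-agnostic layer in the management code, with each cloud hidden behind its own adapter. The sketch below is purely hypothetical (none of the class names correspond to a real vendor SDK); it only illustrates the shape of the idea.

```python
# Hypothetical sketch of a provider-agnostic compute interface. Real adapters
# would translate these calls into a specific provider's API; the provider
# class here is a stand-in used only for illustration.
from abc import ABC, abstractmethod

class CloudCompute(ABC):
    """Minimal interface a single management tool could program against."""

    @abstractmethod
    def start_instance(self, image_id: str) -> str:
        """Launch a VM from an image and return the provider's instance ID."""

    @abstractmethod
    def stop_instance(self, instance_id: str) -> None:
        """Terminate a running VM."""

class ExampleProvider(CloudCompute):
    def start_instance(self, image_id: str) -> str:
        # A real adapter would call the provider's SDK or REST API here.
        return f"example:{image_id}:instance-1"

    def stop_instance(self, instance_id: str) -> None:
        pass  # a real adapter would call the provider's terminate operation

def burst(primary: CloudCompute, overflow: CloudCompute,
          image_id: str, need_overflow: bool) -> str:
    # The "overflow to a public cloud" goal: same code path, different provider.
    target = overflow if need_overflow else primary
    return target.start_instance(image_id)
```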

Amazon has become perhaps the best-known vendor providing both compute and storage services in the cloud model, and the company's APIs have been called "de facto" standards by those who have expressed hope that Amazon will release them as open source software.

Many companies are supporting the Open Cloud Manifesto, which intends to establish a set of core principles that all cloud providers should follow. But notable absences include Amazon and Microsoft.

Several vendors are attempting to tie together different cloud services in ways that make them easier to use for IT shops, but each effort seems to have some limitation.

VMware, for example, is calling its latest virtualization platform a "cloud operating system" and promising that enterprises can use the software to build private clouds and connect them to public computing resources. But the software only works with hardware that has been virtualized using VMware technology, and the cloud interoperability is only possible if the cloud provider is using VMware. The latter condition eliminates such big players as Amazon and Google.

The Burton Group analyst firm has partnered with vendors to demonstrate single sign-on across real-world applications such as Salesforce.com, Google Apps and Cisco's WebEx, using tools based on standards such as SAML and WS-Federation.

"We're hearing from our clients that many of the applications are moving off premises and into the cloud, and it's putting a strain onto attempts to present as few authentications to users as possible," says Burton Group analyst Gerry Gebel.

Cloud application vendors will have to adopt a standards-based approach to make single sign-on ubiquitous, Gebel says. While he is optimistic that most vendors will come on board, he says the list of vendors following the standards-based approach is probably shorter than the list of those that are not.

In another interoperability effort, the vendor AppZero has created virtual appliances that allow the movement of server-based applications from private data centers to public clouds, and from one cloud to another, such as from Amazon Web Services to GoGrid.

Rival vendor 3Tera says this approach doesn't account for multi-tier applications that span many virtual machines. 3Tera says it encapsulates all the components of an application, including firewall, load balancer, Web and application servers, databases and operating system into one entity that can be easily moved from one cloud to another. But this portability only works if each cloud was built using the 3Tera platform.

"The problem right now is there is no interoperability among any clouds," says Bert Armijo, senior vice president of sales and product marketing for 3Tera. Say you write an application specifically for Amazon's Elastic Compute Cloud. Since the code is written specifically for Amazon's platform, "that application is going nowhere," Armijo adds.

Going forward, vendors will have to agree upon a "common set of standards and interfaces" to ensure true interoperability, says IBM cloud computing software chief Kristof Kloeckner.

"In cloud terms, there are some services you receive through a service provider, some services you deliver through an internal cloud, and some that you normally deliver with an internal cloud but you may want overflow capacity for peak times," Kloeckner says. "All this movement of services, applications, and combination of applications only works if all the providers adhere to a common set of standards and interfaces." But today, most public compute clouds are based on virtual machine models that aren't compatible with each other, Kloeckner says.

Mark O'Neill, the CTO of Vordel, says enterprises should be able to use a best-of-breed cloud approach, having applications that span different providers of storage, compute and application hosting platforms.

"Vordel often speaks with customers wishing to make use of best-of-breed cloud services – for example using Amazon for external storage (the S3 service) while using Force.com to pull customer order information into behind-the-firewall applications," O'Neill writes in an e-mail. "Key issues are allowing single sign-on across cloud services, for the same application, as well as allowing a service running on one cloud platform (e.g. a hosted application on Amazon Elastic Compute Cloud) to call a service hosted by another cloud provider (e.g. Google) in a managed manner."

Vordel's XML Gateway is designed to link single applications to multiple cloud services and provide the single sign-on capability mentioned by O'Neill, all without requiring onerous work on the part of developers.

While Vordel's offering is likely useful for many types of customers, O'Neill says his company hasn't tackled the challenge of trying to move applications from one cloud to another, saying "generally that's an unsolved problem across the board."

But there is room for optimism, says Robert Grossman, chairman of the newly formed Open Cloud Consortium and director of the Laboratory for Advanced Computing (LAC) and the National Center for Data Mining (NCDM) at the University of Illinois at Chicago.

Grossman says enterprises with private clouds should be able to obtain extra computing resources from any public cloud without changing the API. Even today, the open source Eucalyptus private cloud software is largely compatible with the Amazon API, making it easy to get excess capacity from the Amazon cloud, Grossman says. "If I use Eucalyptus in-house and engineer the application correctly, I can get surge capacity from Amazon," he says.
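
Grossman's point is easy to picture: when the private cloud speaks the same API as Amazon, the only thing the application changes is the endpoint it talks to. A minimal sketch of that idea, using boto3 (a current AWS SDK for Python) and a hypothetical in-house endpoint URL:

```python
# Sketch of "same API, different endpoint": assumes boto3 is installed and
# credentials are configured, and that the private cloud exposes an
# EC2-compatible interface. The internal URL below is made up for illustration.
import boto3

def list_instances(endpoint_url=None):
    # endpoint_url=None talks to Amazon's public EC2 endpoint; passing a URL
    # sends the very same calls to the in-house, EC2-compatible cloud instead.
    ec2 = boto3.client("ec2", region_name="us-east-1", endpoint_url=endpoint_url)
    return ec2.describe_instances()

# in_house = list_instances("https://cloud.example.internal:8773/services/compute")
# surge    = list_instances()  # overflow capacity from Amazon, code unchanged
```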

Moving workloads from one public cloud to another is more difficult because that requires standardized management tools, he says. Exchanging data between clouds is another problem. Just as TCP allows the bridging of two networks, Grossman says he'd like to see an inter-cloud protocol allowing multiple clouds to exchange information.

"People are still trying to get their hands around some of the issues," Grossman says. "What we have is a very young, very vibrant, rapidly moving industry that is still sort of sorting itself out," he says. "I think things are trending in the right direction."

Microsoft's looming Windows 7 licensing 'disaster' for XP users

Windows 7, due to ship on Oct. 22, has gotten good reviews as the OS that Vista should have been. And the large percentage of businesses that have held onto XP rather than go to Vista - about half, according to Gartner - are no doubt planning to migrate to Windows 7. But Microsoft may be making it harder and costlier for them to do so, notes Gartner analyst Michael Silver. "It's a disaster waiting to happen," he says.

Microsoft's potential XP downgrade trap

Under Microsoft's planned enterprise licensing rules, businesses that buy PCs before April 23, 2010, with Windows 7 preinstalled can downgrade them to Windows XP, then later upgrade them to Windows 7 when they're ready to migrate their users. But PCs bought on or after April 23 can only be downgraded to Vista - which is of no help for XP-based organizations, Silver notes - and the restriction could cause major headaches and add more costs to the Windows 7 migration effort.

Microsoft's PR firm tells InfoWorld, "It looks like Microsoft hasn't made any announcements around timing for downgrade rights from Windows 7 to Windows XP yet." But Microsoft has discussed the six-month limit with Silver multiple times and characterized it to him as a "public" policy. The policy is also clearly visible in a Microsoft PowerPoint slide (available for viewing at InfoWorld).

Both Forrester Research and Gartner advise clients to wait 12 to 18 months after Windows 7 ships before adopting the new OS, so they can test compatibility of their hardware and software, as well as ensure their vendors' Windows 7 support meets their needs. But Microsoft's six-month downgrade restriction for XP means that the businesses that chose not to install Vista have to rush the migration process. Or they can spend extra money and enroll in Microsoft's Software Assurance program, which then lets them install any OS version at the price of the extra yearly fee (about $90) per PC. "Microsoft will probably get more money out of [this policy]," Silver says.

For businesses not willing to pay extra for the Software Assurance program, Silver sees real headaches coming, which ironically could slow the adoption of Windows 7 by XP-based businesses. Organizations could buy more PCs than needed by April 22 to essentially stock up on XP-downgradable Windows licenses, but that distorts their purchasing costs. Or they could buy PCs as needed after April 23 and either live with Vista or Windows 7 on them - perhaps allocating those systems as test units instead of regular production systems - or buy XP licenses from retailers that still have them in stock. Tracking which PCs have which downgrade rights in IT asset management systems, though, "will be difficult," Silver notes. "Microsoft has made a real mess."

"Users need to say this policy doesn't make sense," Silver advises, and try to convince Microsoft to change it. Consumer pressure has worked to sway Microsoft licensing policies before; notably, businesses' strong resistance to Vista caused Microsoft to extend the availability of Windows XP several times in various forms.

The perils of using XP mode

Silver notes that Microsoft is sending mixed signals to XP-based users, given that it will include a license for XP as part of Windows 7 Ultimate in what is called XP mode. In XP mode, a virtual machine can run Windows XP in parallel to Windows 7. But this approach doubles IT's workload, as it must deploy and manage two OSes per PC: Windows 7 and Windows XP. "That's not optimal," Silver says. And because many PCs can't run the Virtual PC technology that makes XP mode work, IT will face compatibility complications as well.

Silver suggests that XP mode will end up being used only for XP applications that can't run under Windows 7 (whether or not they're formally supported in Windows 7 by their vendors). But there may be more of those than IT realizes. The reason: Web apps tuned to Internet Explorer 6, which Microsoft has essentially orphaned. Windows 7 will ship with IE8, which has a compatibility mode for IE7, but not for IE6. And if IT retains IE7 in Windows 7, Silver notes that IE7 lacks an IE6 compatibility mode. So IT must rework its IE6-dependent Web apps or use XP mode to run IE6. Both are hassles.

Other questions on moving to Windows 7

IT needs to work through several other issues when figuring out its Windows 7 migration strategy, Silver points out.

SAP first quarter earnings drop 16 percent

SAP reported first quarter net income down 16 percent year on year, and revenue down 3 percent, as customers remain reluctant to spend on new software.

Net income for the first quarter fell to €204 million (US$269 million as of March 31, the last day of the period reported) from €242 million a year earlier. SAP blamed the fall on a restructuring charge related to previously announced staff lay-offs.

Revenue fell to €2.40 billion from €2.46 billion a year earlier. Within that, software support revenue rose 18 percent to €1.25 billion, a rise somewhat offset by a fall in professional services revenue, down 9 percent to €649 million.

The biggest fall was in software sales, down 33 percent to €418 million. SAP blamed the decline on a difficult operating environment worldwide due to the global economic downturn.
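
For readers who want to sanity-check the year-over-year percentages against the rounded euro figures quoted above, a quick sketch follows (SAP's reported percentages are based on unrounded figures, so small differences are expected):

```python
# Year-over-year change computed from the rounded figures quoted above
# (millions of euros). Differences from SAP's stated percentages reflect
# rounding in the published amounts.
def yoy_change(current, prior):
    return (current - prior) / prior * 100

print(f"Net income:    {yoy_change(204, 242):+.1f}%")   # roughly -16 percent
print(f"Total revenue: {yoy_change(2400, 2460):+.1f}%") # roughly -2 to -3 percent
```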

It is unclear when buyers will regain confidence: "Visibility for software revenues remains limited," SAP said.

However, SAP's customers are continuing to buy software, albeit in smaller pieces, SAP co-CEO Leo Apotheker said during a conference call with financial analysts.

"It is obvious that in the current climate customers are trying to pinch every dollar, euro and yen before they spend it," he said.

SAP is therefore focusing on growing the overall volume of deals "in every theater and every market segment," said Bill McDermott, an SAP executive board member and president of global field operations.

The company is also looking forward to the upcoming general release of its Business Suite 7 application, as well as new BI (business intelligence) software, which will help drive revenues up, Apotheker said.

But the company declined to comment further on the outlook for the rest of the year, sticking to the same forecast it provided in January. Back then it made no predictions for future revenue, citing uncertainty about the business environment.

SAP may have slightly lower software support revenue in the coming years than it had previously hoped. Following pressure from users, the company announced Wednesday that it has capped the price of its new Enterprise Support program at 22 percent of the software license price until at least 2015. For existing users forced to migrate to that service from a cheaper existing one, the price rise will be spread over a longer period, limiting increases to 3.1 percent a year, rather than the previous 8 percent a year, SAP said.

But Enterprise Support "will become a competitive advantage" for the company, Apotheker predicted.

Along with its earnings report, SAP on Tuesday announced that it had agreed on a set of KPIs (key performance indicators) for Enterprise Support with SUGEN (SAP User Group Executive Network), a group composed of SAP user groups around the world.

No other vendor "comes even close" to providing the level of insight into support costs that the KPIs will afford, Apotheker said.

"I have to tell you that by having [created the KPIs], the discussion with our customers has radically changed. It is not about price anymore. Now people are focused on extracting the value," he said.

Meanwhile, SAP repeated its January prediction that operating margin will remain around 25 percent - if full-year software and software-related service revenues at constant currency remain flat or decline by 1 percent from their 2008 level of €8.62 billion. Companies tend to hedge their predictions against adverse moves in exchange rates by assuming constant currency, but in the first quarter foreign exchange movements acted in SAP's favor. First-quarter software and software-related service revenue remained flat; excluding the support revenue that Business Objects would have recognized had it remained a stand-alone entity, it fell 2 percent, and would have fallen 4 percent at constant currency, SAP said.

The company stated its results according to U.S. generally accepted accounting principles (GAAP), but said that from the end of this year, it will only use International Financial Reporting Standards (IFRS) for external communications. It will also use IFRS figures for internal reporting, forecasting and incentive-based compensation plans for staff, it said. SAP began preparing financial reports according to both GAAP and IFRS in 2007, to comply with German and European law.

The only difference in reported revenue between the two accounting standards concerns SAP's now-closed third-party software maintenance subsidiary, TomorrowNow. SAP's U.S. GAAP income statement shows TomorrowNow's revenue and income separately because it is a discontinued operation, but IFRS does not allow this separation because TomorrowNow is not a material operation, SAP said.