Leaked: First Office 14 screenshots

February 7, 2009

It didn’t take long this time, either. Earlier this week, testers received alpha builds of Office 14, the codename for the successor to Office 2007. A reader wrote in to tell us that a tester from the Russian site wzor.net has now leaked screenshots of the applications included in the productivity suite.

While he didn’t screenshot every application individually, we do now know that the list includes: Access 14, Excel 14, Groove 14, InfoPath Designer 14, InfoPath Filler 14, InterConnect 14, OneNote 14, Outlook 14, PowerPoint 14, Project 14, Publisher 14, SharePoint Designer 14, Visio 14, and Word 14. Without further ado, here they are:

I’ve been told that the beta of Office 14 will start in May, and the final version is slated for the end of the year. The alpha is reportedly running with good stability, all things considered. Redmond is expected to provide official release details for Office 14 next quarter.


Holy! India Brings Out the $10 Laptop

February 4, 2009

Just saw this bit of news on Rediff that India has unveiled its much talked- and blogged-about $10 laptop, and apparently the Ministry of HRD has kept its promise on the price. After Ratan Tata laid claim to the cheapest car ever with the Nano last year, India now has its hands on the cheapest laptop as well, if this story holds up.

India first announced the plan to make these ultra-cheap laptops in the summer of 2007, almost two years back, and said at the time it would take around two to three years to come out with the product. Surprisingly, with this launch we have even stuck to those timelines. The laptop is expected to hit the market in about six months.

The $10 laptop has always seemed a very unlikely aspiration on the part of the Indian government and was seen by many as little more than an ego battle against Nicholas Negroponte’s $100 laptop from the OLPC programme, which India had refused to join. The announcement was met with a lot of skepticism in the blogosphere, and there were contradictory reports that the government actually meant $100 instead. However, laying all the rumours aside, the project has finally been unveiled, and the price is expected to be anywhere between Rs 500 and Rs 1,000.

“The mission was launched at a huge gathering of academicians and the officials from across the country including thirty vice chancellors of central and state universities at the campus of Sri Venkateshwara University Tirupati.” That is a Rediff quote; the only thing that sounds amiss is that if it was such a huge gathering, why aren’t there any other reports on the matter yet? Anyway, that is more a curiosity than an accusation.

S K Sinha, joint secretary in the ministry for education, gave a demo to this packed audience, though as of now we have no report on what the product looks like or any concrete specs. The laptop reportedly has 2 GB of onboard memory and wireless Internet connectivity.

It seems a little too good to be true even now, and I am waiting eagerly to see what this thing looks like. If it holds up on quality the way the Nano did, it will most certainly be a proud moment for a nation whose contribution to computing began with a simple zero. Fingers crossed.


Transmitting data 16 times faster @ 640 billion bits per second

February 3, 2009

Every second, millions of phone calls and cable TV shows are dispatched through fibres as digital zeros and ones formed by chopping laser pulses into bits.

This slicing and dicing is generally done with an electro-optic modulator, a device for allowing an electric signal to switch a laser beam on and off at high speeds. Reading that fast data stream with a compact and reliable receiver is another matter.

A new error-free speed-reading record using a compact ultra-fast component – 640 gigabits per second (Gbps), or 640 billion bits per second – has now been established jointly by scientists from Denmark and Australia.

New technology and new ways of doing business require new approaches to old procedures. Conventional readers of optical data depend on photo-detectors, electronic devices that can operate up to approximately 40 Gbps.

This in itself represents a great feat of rapid reading, but it’s not good enough for the higher-rate data streams being designed now. Sometimes, to speed up data transmission, several signals are multiplexed: each carries its own stream of coded data, and all are sent down an optical fibre at the same time.

In other words, 10 parallel streams of data could each be sent at a rate of 10 Gbps and then added up to an effective stream of 100 Gbps. At the receiving end the parallel signals have to be read out in a complementary de-multiplexing process.
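To make that arithmetic concrete, here is a minimal sketch in Python (my own illustration, not anything from the research, which does all of this optically) of interleaving ten slower bit streams into one faster stream and then picking them back apart:

# Illustrative time-division multiplexing/de-multiplexing in software.

def multiplex(streams):
    # Interleave N parallel bit streams into one serial stream.
    # Bit i of stream k lands in time slot i*N + k, so the combined
    # stream runs N times faster than each tributary.
    n = len(streams)
    combined = []
    for i in range(len(streams[0])):
        for k in range(n):
            combined.append(streams[k][i])
    return combined

def demultiplex(combined, n):
    # Recover the N tributaries by reading every N-th time slot.
    return [combined[k::n] for k in range(n)]

# Ten tributaries of 8 bits each become one 80-bit line signal,
# just as ten 10 Gbps streams combine into one 100 Gbps stream.
tributaries = [[(k + i) % 2 for i in range(8)] for k in range(10)]
line_signal = multiplex(tributaries)
assert demultiplex(line_signal, 10) == tributaries

The receiver’s job is exactly that last step: pulling each tributary back out of the combined stream without errors.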

Reliable and fast multiplexing and de-multiplexing represent a major bottleneck in linking up the electronic and photonic worlds.

In 1998 researchers in Japan created a data stream as high as 640 Gbps and were able to read it back, but the read-out apparatus relied on long lengths of special optical fibre. This particular approach is somewhat unstable.

The new de-multiplexing device demonstrated at the Technical University of Denmark, by contrast, can handle the high data rate, and can do so in a stable manner.

Furthermore, instead of 50-metre-long fibres, they accomplish their de-multiplexing of the data stream with a waveguide only five cm long, an innovation developed at the Centre for Ultrahigh Bandwidth Devices for Optical Systems, or CUDOS, in Australia.

Another benefit of the new device’s compact size is the potential for integration with other components to create more advanced ultra-fast functional chips. The dynamics involved in the CUDOS device could even allow for still higher data rates approaching terabits per second (Tbps, or trillion bits per second), said a CUDOS release.

Danish scientist Leif K. Oxenløwe, a co-author of the study, said that the record de-multiplexing speeds achieved by his tiny glass microchip are a boon to circuit designers and open the door to faster network speeds. In the near future, the Danish and Australian researchers hope to achieve 1 Tbps Ethernet capability.

These findings were published in Optics Express, the Optical Society’s (OSA) open-access journal.


Google Earth 5.0 goes under the sea, back in time

February 3, 2009

Google announced a new version of Google Earth today with features that focus on what is under our ocean, in our past, and above our heads. Ars Technica did some vicarious adventuring to check out the new features.

Google Earth 5.0 (beta) is available for Mac OS X, Windows, and Linux users, though we should note that Google has become even more aggressive with the installation of its software update mechanism. Instead of covertly installing the software like it has in the past or offering the option to disable said updater like it should, Google now presents a dialog that forces the user to agree to the software license and installation of a phantom software update tool that cannot be uninstalled if the user wants to run Google Earth. But let’s not dwell on the negatives, because there is a lot to love about this new version.

Initial gripes aside, Google Earth 5.0 exhibits yet more UI refinements and polish that bring it more in line with Google’s other desktop software. One of the most interesting features of this release is the introduction of an interactive ocean. While Google Earth has long featured large blue bodies of water and basic 2D surface detail, version 5.0 allows users to dive below the surface and explore a 3D, bathymetric map of much of the ocean floor. Users can simply keep zooming in on the world’s oceans and many large seas; wherever depth details begin appearing, they can continue zooming past the ocean surface and orient their view to start swimming.

Of course, introducing an entirely new way to view two-thirds of the world’s surface would not be complete without some actual data points to make all that space interesting. Google added a new ocean-centric layer of toggleable information to Google Earth, including content from National Geographic, Cousteau Ocean World, shipwrecks, animal tracking, and more. While you cannot dive to actually visit and zoom around underwater landmarks like shipwrecks, Google does provide a wealth of embedded content from sites like Wikipedia, National Geographic, and YouTube for many significant locations.
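As an aside for the technically inclined: content layers like these reach Google Earth as KML files, so it takes only a few lines of code to build a placemark layer of your own. A minimal sketch in Python (the wreck marker below is invented purely for illustration and has nothing to do with Google’s actual ocean data):

# Build a tiny KML file that Google Earth can open as a placemark layer.
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def write_placemark(name, description, lon, lat, path):
    ET.register_namespace("", KML_NS)
    kml = ET.Element("{%s}kml" % KML_NS)
    doc = ET.SubElement(kml, "{%s}Document" % KML_NS)
    pm = ET.SubElement(doc, "{%s}Placemark" % KML_NS)
    ET.SubElement(pm, "{%s}name" % KML_NS).text = name
    ET.SubElement(pm, "{%s}description" % KML_NS).text = description
    point = ET.SubElement(pm, "{%s}Point" % KML_NS)
    # KML coordinates are longitude,latitude[,altitude].
    ET.SubElement(point, "{%s}coordinates" % KML_NS).text = "%s,%s,0" % (lon, lat)
    ET.ElementTree(kml).write(path, xml_declaration=True, encoding="UTF-8")

# A made-up marker in the mid-Atlantic; open the resulting file in Google Earth.
write_placemark("Example wreck", "Illustrative placemark only", -30.0, 35.0, "my_layer.kml")

Double-clicking the .kml file (or dragging it into Google Earth) adds the marker to your Places panel, which is how third-party overlays are typically shared.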


VMware Fusion 2 now available

September 28, 2008

Around the same time that VMware was embracing the Virtual Datacenter OS and the “internal cloud,” it was also launching the latest version of its Mac desktop virtualization product, Fusion 2.0.

“VMware Fusion 2 makes it easy and fun for every Mac user to run the Windows applications they need while enjoying the Mac experience they want,” said Pat Lee, group manager for consumer products, VMware. “Our goal is to break down the walls between Windows and the Mac by creating a user-friendly, Mac-native experience that lets our customers run any Windows application, seamlessly and safely, on the Mac. We want our customers to see that Windows really is better on the Mac.”

According to VMware, Fusion 2 adds more than 100 new features and enhancements to the product, and the company claims it delivers the most advanced Mac virtualization software available today. Today, perhaps, because virtualization platform vendor Parallels is planning to release its Desktop for Mac 4 product soon enough. While the heat between these two companies in the Mac space has cooled down somewhat in recent months, things could be heating up again real soon.

Among the changes in Fusion 2 is a new take on protection. In addition to being able to take multiple snapshots of a VM in any number of states, VMware has also added “AutoProtect,” which automatically records snapshots of running VMs at regular intervals. They’ve also added virus protection. OK, so it isn’t some cool new cyber-agent type thing; it’s a 12-month subscription to McAfee VirusScan Plus. Still, not bad.
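If you’re curious what AutoProtect is actually automating, you can approximate it yourself by scripting the vmrun command-line tool that ships with Fusion. A rough sketch in Python (the vmrun and .vmx paths and the interval are placeholders, and unlike the real AutoProtect this does nothing to prune old snapshots):

# Take a snapshot of a running VM every half hour, AutoProtect-style.
import subprocess
import time
from datetime import datetime

VMRUN = "/Library/Application Support/VMware Fusion/vmrun"   # adjust to your install
VMX = "/Users/you/Documents/Virtual Machines/WinXP.vmwarevm/WinXP.vmx"  # placeholder
INTERVAL_SECONDS = 30 * 60

while True:
    snapshot_name = "auto-" + datetime.now().strftime("%Y%m%d-%H%M%S")
    subprocess.run([VMRUN, "-T", "fusion", "snapshot", VMX, snapshot_name], check=True)
    time.sleep(INTERVAL_SECONDS)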

As far as display goes, if you happen to have 10 monitors lying around, guess what? You now have the ability to run applications on all 10 displays with Fusion’s multiple monitor support. Graphics have also been enhanced. VMware has added new 3D graphics support and compatibility with DirectX 9.0c and Shader Model 2 for software and games.

VMware claims that Fusion 2 supports an impressive list of more than 90 operating systems, including Windows Vista and Windows XP. (Don’t forget, you still need to purchase separate operating system licenses.) This is one of the advantages that VMware has over competitors. And now, they also offer experimental support for Mac OS X Server 10.5 (Keep in mind this does not include the standard edition of the OS). In addition, users can also now operate virtual machines with up to four virtual CPUs (Remember, the guest operating system will need to support that number of processors as well).

Feature favorite Unity is still around, breaking down the walls between Windows and Mac OS X, transforming Windows applications to work seamlessly within OS X like native applications. Users can launch any Mac file with any Windows application, seamlessly share data and folders between Windows and Mac, and even custom map the Mac keyboard to special keystrokes for Windows applications.

VMware Fusion 2 is a free, downloadable upgrade for all VMware Fusion 1.x customers. So what if you don’t have Fusion 1.x? Well, you can buy it outright at retail for $79.99. But if you own a competitor product and you feel the need to switch, you can grab a $30 rebate offer until the end of the year.


Microsoft amassing high-performance server software attack

September 28, 2008

Microsoft has built a strategy around the planned early-November release of its high-performance computing server that it hopes will be the catalyst to deliver massive computing power for future applications.

The strategy encompasses Microsoft applying its typical mantra of “simplifying computing” to the costly and often complex high-performance computing world in the form of its Windows HPC Server 2008 surrounded by Microsoft’s collection of applications, management wares, development tools, and independent software vendor community.

“We are not talking about a lot of unique product development here; it is mostly about packaging and coming up with appropriate licensing,” says Gordon Haff, an analyst with Illuminata. “But as HPC becomes more and more mainstream and used for all kinds of commercial roles, whether it is product design or business analytics, Windows is not such an unnatural fit as it might have been in the past.”

Microsoft said last week that it would release HPC Server 2008 on Nov. 1, the company’s most competent move to date to offer parallel computing horsepower to corporations doing more real-time simulations, designs, and number crunching.

But the road is decidedly uphill.

Microsoft currently lays claim to less than 5 percent of HPC server market revenue, according to IDC. Those numbers compare with 74 percent for Linux and just more than 21 percent for Unix variants.

In addition, competitors are already entrenched: Red Hat has been offering its Enterprise Linux for HPC Compute Nodes since last year, and Sun re-entered the HPC fray late last year with its Constellation System.

Those sorts of challenges, however, have not deterred Microsoft in the past.

The company is betting users such as engineers will combine workflows running on their Windows workstations with Windows-based back-end HPC clusters, or move those workloads off the desktop and into an HPC infrastructure.

Microsoft also envisions desktop/back-end combinations such as Excel users performing a function call from their desktop that, in the background, executes an agent running computational algorithms on a networked HPC cluster and returns an answer. The user would have no notion of the back end tied to Excel, which is widely used in financial services.
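The shape of that workflow is simple even if the plumbing isn’t. Here’s a purely local Python sketch of the pattern, with a process pool standing in for the cluster’s job scheduler; it is not Microsoft’s HPC Server API, just the offload-and-wait idea behind it:

# A stand-in for the Excel scenario: the front end hands a numeric job to a
# pool of workers (playing the role of the cluster) and blocks until the
# answer comes back to the "cell".
from concurrent.futures import ProcessPoolExecutor
import random

def monte_carlo_pi(samples):
    # The computational algorithm that would run on cluster nodes.
    inside = sum(1 for _ in range(samples)
                 if random.random() ** 2 + random.random() ** 2 <= 1.0)
    return 4.0 * inside / samples

def spreadsheet_function(samples_per_node, nodes):
    # What the workbook-side call would do: fan the work out, wait, combine.
    with ProcessPoolExecutor(max_workers=nodes) as pool:
        partials = list(pool.map(monte_carlo_pi, [samples_per_node] * nodes))
    return sum(partials) / len(partials)

if __name__ == "__main__":
    print(spreadsheet_function(samples_per_node=200000, nodes=4))

Swap the process pool for a job submitted to the cluster’s scheduler and the user experience is the one Microsoft describes: type a formula, get an answer, never see the back end.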

Since the 2006 release of Windows Compute Cluster Server 2003, Microsoft has been working with partners such as HP and Intel to create mass-market appeal for HPC, and the message may finally be striking a chord as prices drop and performance rises on technical computing platforms.

But Microsoft, experts say, isn’t likely to climb the ladder and replace high-end HPC environments built on Linux and Unix.

The real opportunity is appealing to new buyers with a Windows desktop infrastructure looking anew at HPC for workgroups or departments.

IDC says that in 2007 alone, HPC hardware revenue generated by workgroup and departmental platforms was nearly $5.5 billion, just more than half of the $10 billion total. Platforms in those segments range in price from $100,000 and below (workgroup) to between $100,000 and $250,000 (departmental).

Microsoft’s recent hardware-software partnership with Cray on the CX1 “personal” supercomputer, aimed at financial services, aerospace, automotive, academia, and life sciences and priced at $25,000, is testament to Microsoft’s plan, as is the $475-per-node price of HPC Server 2008.

That’s not to say Microsoft won’t make a run for the top. Earlier this year, a Windows Server 2008 HPC cluster built by the National Center for Supercomputing Applications garnered the No. 23 ranking on the Top500 list of the world’s fastest supercomputers, achieving 68.5 teraflops and 77.7 percent efficiency on 9,472 cores.

But experts say Microsoft’s sweet spot will be much lower down the list.

“The Microsoft strategy is aiming hardest at verticals where Windows is strong on the desktop and then extending that Windows environment upward,” says Steve Conway, research vice president for technical computing at IDC. “It includes applications such as Excel and tools like Visual Studio so people can unify their desktop and server workflow.”

Microsoft also plans to integrate HPC Server with its System Center tools for application-level monitoring and rapid provisioning by releasing an HPC Management Pack for System Center Operations Manager by year-end, according to Ryan Waite, product unit manager for HPC Server 2008.

The company is aligning HPC Server 2008 with Visual Studio Team System and with F#, a development language designed to help write new applications, and rewrite old ones, for parallel computing environments.

“We are looking at the holistic system,” says Vince Mendillo, director of HPC in the server and tools division at Microsoft.

Familiarity is the big theme. Windows HPC Server 2008 is built on the 64-bit edition of Windows Server 2008.

The platform combines into a single package the operating system with a message passing interface and a job scheduler built by Microsoft.

The server software, built to scale to thousands of cores, also includes a high-speed NetworkDirect RDMA, Microsoft’s new remote direct memory access interface, and cluster interoperability through standards such as the High Performance Computing Basic Profile specification produced by the Open Grid Forum. The server features high-speed networking, cluster management tools, advanced failover capabilities and support for third-party clustered file systems.
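For readers who have never touched it, the message passing interface bundled here exposes a programming model that looks roughly like the following. This sketch uses the mpi4py Python bindings purely for brevity; on Windows HPC Server you would more likely be writing against MS-MPI from C or Fortran:

# Each process (rank) computes a partial sum; rank 0 combines the results.
# Launch with something like:  mpiexec -n 4 python sum_demo.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

local = sum(range(rank * 1000, (rank + 1) * 1000))   # this rank's slice of the work
total = comm.reduce(local, op=MPI.SUM, root=0)       # combine on rank 0

if rank == 0:
    print("%d ranks computed a total of %d" % (size, total))

The job scheduler’s role is to place those ranks on cluster nodes and wire up the fast interconnect underneath, so the application code never has to care where it runs.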

“HPC is no longer a niche either in terms of hardware platform or in terms of pervasiveness,” Illuminata’s Haff says. “For the most part, it is using volume hardware and is being applied to all kinds of problems in all kinds of companies and organizations.”

It is that trend that Microsoft is betting on.

“We can take people’s apps on Windows workstations and automatically scale those apps with supercomputer capabilities on the back end,” Microsoft’s Waite says. “When you pull all those pieces together in an integrated fashion, HPC becomes easier to use.”


Adobe Gives Creative Suite 4 More Flash

September 26, 2008

Collaboration and integration are two key focuses of Adobe’s latest software rollout, Creative Suite 4. The integration of Flash technology throughout the application suite is one of the major components of the upgrade.


Adobe’s (Nasdaq: ADBE) long-awaited Creative Suite 4 has made its public debut. As the company has demonstrated with previous releases of Creative Suite, as well as other products, it is advancing a Web 2.0 agenda. For example, Adobe has integrated Flash throughout Creative Suite 4 to facilitate collaboration among designers and developers as they craft digital work products.

Toward that end, Adobe has also realigned workflow enhancements — changes that were on display for a short time earlier this year, when the Adobe Labs site posted publicly available betas of Dreamweaver, Fireworks and Soundbooth.

Flash Integration

Integration of Creative 4’s many moving parts was a fundamental goal of its developers, Adam Pratt, senior solutions engineer, told TechNewsWorld.

“It is important that each application does a great job for its particular function, but users don’t work in isolation anymore — so integration across the Web, print and video workflows is more important than ever before,” he explained.

Citing the wholesale integration of Flash technology, Pratt noted that prior to Creative 4, “there had been little samples here and there, such as with video. But nothing like this.”

The new suite — Adobe’s largest software rollout to date — includes Adobe Creative Suite 4 Design editions, Creative Suite 4 Web editions, Creative Suite 4 Production Premium and Creative Suite 4 Master Collection, as well as 13 point products, 14 integrated technologies and seven services.

Customers can choose from six suites or full version upgrades of 13 stand-alone applications, including Photoshop CS4, Photoshop CS4 Extended, InDesign CS4, Illustrator CS4, Flash CS4 Professional, Dreamweaver CS4, After Effects CS4, and Adobe Premiere Pro CS4.

Enhanced Features

Changes to the functionality include a simplified workflow that enables users to complete tasks and switch between mediums without leaving a project.

InDesign CS4 provides a new Live Preflight tool that allows designers to catch production errors; a newly customizable links panel helps them place files more efficiently.

The Content-Aware Scaling tool in Photoshop CS4 and Photoshop CS4 Extended automatically recomposes an image as it is resized, preserving areas as it adapts to new dimensions.

An expanded version of Dynamic Link in CS4 Production Premium allows users to move content between After Effects CS4, Adobe Premiere Pro CS4, Soundbooth CS4 and Encore CS4, so the updates can be seen without rendering.

3-D functionality is also a focus in Adobe Creative Suite 4, with users able to paint, composite and animate 3-D models using pre-existing tools.

In Flash CS4 Professional, users can apply tweens to objects instead of keyframes, and the new Bones tool creates more realistic animations between linked objects.

With Adobe Device Central CS4, which has a library of more than 450 device profiles from manufacturers, users can test mobile content designed using many of the Creative Suite 4 products.

Adobe Creative Suite 4 also expands access to online collaborative services. Adobe ConnectNow, an Acrobat.com service that allows real-time collaboration with two other people, can be accessed from InDesign CS4, Illustrator CS4, Photoshop CS4 and Photoshop Extended CS4, Flash CS4 Professional, Dreamweaver CS4, Fireworks CS4, and Acrobat 9 Pro.

Designers can also share color harmonies with Adobe Kuler, which is now accessible from InDesign CS4, Illustrator CS4, Photoshop CS4 and Photoshop Extended CS4, Flash CS4 and Fireworks CS4.

Adobe Creative Suite 4 and its component products will ship in October 2008 and will be available through Adobe authorized resellers and at the Adobe Store.

Adobe Creative Suite 4 Design Premium is expected to retail for US$1,799. Other price points include $1,699 for Adobe Creative Suite 4 Web Premium, $1,699 for Adobe Creative Suite 4 Production Premium, and $2,499 for Adobe Creative Suite 4 Master Collection.