Extreme Networks is readying a new family of networking modules, aimed at the wiring closet, for its BlackDiamond 8800 chassis. The 8500 series joins existing modules aimed at other layers of the network up to the data center, enabling businesses to run networking for the wiring closet, core and data center from within the same chassis.
Extreme Networks is adding new modules to its BlackDiamond 8800 switching family that will enable the portfolio to cover all scenarios, from the wiring closet to the data center.
Extreme on Oct. 7 unveiled its 8500 series modules, which are aimed at the access edge and wiring closets. The modules join Extreme’s 8800 c-series modules for the aggregation/core level and the 8900 series for data centers as networking options that fit inside the 8800 enclosure.
“[The new 8500 series family] pretty much gives the 8800 a complete portfolio,” said Huy Nguyen, director of enterprise marketing at Extreme. “It allows the 8800 chassis to be deployed anywhere, from the wiring closet to the data center.”
That’s important, Nguyen said, because businesses tend to be reluctant to switch out enclosures.
“Once I have a chassis in my rack, it’s going to take a lot to remove it from my rack,” he said. “Businesses don’t want to do it.”
All the BlackDiamond modules use the same 8800 infrastructure, from power supplies to fan trays to accessories. A key to this ability to offer such a wide range of modules in a single enclosure is the company’s operating system, ExtremeOS, which gives all the modules a common base, Nguyen said.
This gives businesses the ability to upgrade their network modules as needed without having to overhaul their entire switching environment, he said. Unlike competitors such as Hewlett-Packard and Cisco Systems, the BlackDiamond 8800 also offers dual management modules at all levels, for greater redundancy, Nguyen said.
The three new modules—the 8500-MSM24, which is the management switch module; the 8500-G24X-e, a Gigabit fiber module; and the 8500-G48T-e, a 48-port Gigabit Base-T module—offer automated service recovery for high resiliency, as well as automated configuration and security. All three will be available in the fourth quarter.
Tuesday, October 13, 2009
NetScout Gives Visibility into Virtualized Environments
NetScout is rolling out its nGenius Virtual Agent, which gives businesses greater visibility into the applications running within their virtualized environments. Virtualization has made it more difficult to monitor, analyze and troubleshoot applications, which can reside within virtual machines on a single physical system and can be moved between virtual machines. The new NetScout technology gives businesses real-time visibility into those environments.
NetScout Systems wants to give IT administrators greater visibility into their applications that are running in a virtualized environment.
NetScout on Oct. 7 unveiled the nGenius Virtual Agent, software designed to give IT professionals the same packet-flow monitoring and analysis capabilities that they get from using the vendor’s nGenius Probe in the physical world.
The rapid adoption of virtualization technology in the data center has enabled businesses to reduce costs and improve efficiencies, but it’s made it more difficult to track and analyze the performance of their applications on those virtual machines, according to NetScout officials.
Applications used to run on physical machines housed in racks and connected via networking cables, they said. Now a single physical system can be hosting several virtual machines, which in turn are running various applications.
Adding to the challenge is the fact that applications can be dynamically moved from one virtual machine to another, making it even more difficult to monitor and troubleshoot them.
“As enterprises continue to drive data center and server consolidation initiatives and increasingly leverage virtualization technologies, all the challenges associated with optimizing, protecting and troubleshooting applications apply to virtualized networks, but with exponentially more complexity and urgency,” Steven Shalita, vice president of marketing for NetScout, said in a statement.
The company’s nGenius Virtual Agent is integrated with its nGenius Service Assurance Solution, giving businesses real-time visibility into virtual environments. The nGenius Virtual Agent is installed as a virtual appliance on a VM, or in the case of a VMware virtualized environment, is connected to the hypervisor’s virtual switch.
The product offers visibility into all application activity within or across virtual machines, according to NetScout, helping businesses track, analyze and troubleshoot the applications.
Like NetScout’s Probe technology, the nGenius Virtual Agent can either grab packets on demand or according to pre-determined triggers. It can also be used to mirror all virtual server traffic to an external nGenius InfiniStream appliance for continuous monitoring.
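NetScout has not published an API for these triggers, but the on-demand/trigger capture model can be illustrated with a small, hypothetical sketch; the trigger condition and packet fields below are invented for illustration and are not NetScout's:

```python
# Illustrative sketch of trigger-based packet capture (not NetScout's API).
# A capture agent watches a packet stream and begins buffering only after
# a pre-determined trigger condition fires, as described above.

def make_trigger(threshold_bytes):
    """Return a trigger that fires when a packet exceeds threshold_bytes."""
    def trigger(packet):
        return packet["length"] > threshold_bytes
    return trigger

def capture_on_trigger(packets, trigger, max_captured=100):
    """Buffer packets from the first one that satisfies the trigger onward."""
    captured = []
    armed = False
    for pkt in packets:
        if not armed and trigger(pkt):
            armed = True  # trigger fired: start capturing from this packet
        if armed:
            captured.append(pkt)
            if len(captured) >= max_captured:
                break
    return captured

# Simulated packet stream: small keepalives, then a large burst.
stream = [{"length": n} for n in (60, 60, 1500, 60, 900)]
burst = capture_on_trigger(stream, make_trigger(1000))
# Capture starts at the 1500-byte packet and includes everything after it.
```

An on-demand capture is just the degenerate case where the trigger fires immediately; continuous mirroring, by contrast, forwards every packet to an external collector with no trigger at all.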
The nGenius Virtual Agent software is available now for VMware ESX and ESXi deployments. The continuous iDPC (intelligent Deep Packet Capture) TAP mode will be supported later in the fourth quarter.
Ellison: Oracle Not Interested in Brocade
Oracle CEO Larry Ellison reportedly told shareholders that Oracle is not interested in buying networking vendor Brocade, which is rumored to be for sale. Oracle and HP were among the companies reportedly interested in buying Brocade. IBM, Cisco and Dell are other names that have been mentioned as possible buyers.
Don’t look to Oracle to buy networking vendor Brocade Communications Systems.
Oracle CEO Larry Ellison reportedly said during the company’s annual shareholders meeting Oct. 7 that the software giant has no interest in acquiring Brocade. Ellison’s statement came in response to a question from a shareholder.
News organizations reported Oct. 5 that Brocade officials are shopping the company around, and that Hewlett-Packard and Oracle are among the frontrunners to buy it.
Since then, a number of other vendors, including Cisco Systems, Dell and IBM, have been mentioned as possible buyers. Until Ellison’s comment, none of the players involved—including Brocade officials—had commented publicly.
The issue comes at a time of rapid consolidation in the tech industry as larger vendors begin offering unified data center solutions, including servers, storage products, networking devices and management software.
Cisco entered the field earlier this year with the release of its UCS (Unified Computing System), and HP is more closely linking its ProLiant servers with its ProCurve networking business. IBM has expanded its partnerships with various networking companies, including Brocade, and Dell entered the fray with an expanded alliance with Brocade.
Oracle is in the process of buying Sun Microsystems for $7.4 billion, and Ellison and other officials have said they intend to keep Sun’s hardware business.
Brocade has expanded its business over the past year. The company already had a strong Fibre Channel portfolio and in 2008 bought Foundry Networks, which brought with it Ethernet capabilities.
In a report on the Brocade rumors, research firm The Info Pro said Brocade’s business would be a good complement to HP, particularly around such emerging networking technologies as 10 Gigabit Ethernet and FCoE (Fibre Channel over Ethernet).
However, The Info Pro report said that Oracle buying Brocade only made sense if Oracle kept Sun’s hardware business.
Enterprises Understanding Need for Virtualization Management: CA
A recent study sponsored by CA and VMware backs up what officials there are hearing from customers: that management and automation tools are becoming more important as enterprises move their virtualization implementations from test and development to production environments. A CA official said a key finding in the study was that businesses are being more proactive in pursuing greater management capabilities for their virtualized environments.
When Stephen Elliot talks to IT executives about virtualization in the data center, he often finds himself in discussions about management, risk reduction, procedure and controls.
In short, executives who might be looking to virtualization for all the classic reasons—including reducing operating and capital costs—are also now keeping the management of those virtualized environments in the forefront of their minds, according to Elliot, vice president of strategy for CA’s Infrastructure Management and Automation business unit.
A recent study by the IT Process Institute, sponsored by CA and virtualization giant VMware, backs up what Elliot and other CA officials have been hearing.
“The key thing that pleasantly surprised us [in the study] is that customers right now … are thinking more proactively about the need to manage their virtual infrastructure,” Elliot said in an interview. “Just because they’ve got new innovations [in their data centers] doesn’t mean that their need for management just disappears.”
The study, which included data collected from 323 IT organizations in North America, outlined risks involved with server virtualization—from virtual sprawl and single points of failure to complexity and configuration, compliance and capacity issues.
Among the key findings: 72 percent of participants are virtualizing production servers, and of those, 58 percent at one point had paused the implementation to improve operational procedures. In addition, 64 percent of participants said they are comfortable virtualizing business-critical servers.
The study also laid out recommendations for reducing those risks based on the level of maturity of an organization’s virtualization implementation.
This is increasingly important as businesses rapidly move beyond using virtualization in test and development environments and deploy virtualization technology in production arenas, Elliot said.
Shekar Ayyar, vice president of infrastructure alliances for VMware, agreed.
“The results of this study are consistent with the feedback we receive from our customers who tell us that strong management and automation tools are essential for maximizing the payoff from VMware virtualization,” Ayyar said in a statement.
For those enterprises using virtualization to consolidate servers and virtualize business-critical systems in production environments, the study noted 11 management practices, including host access, configuration and provisioning controls, and virtual machine provisioning.
For businesses at the next level, looking for high availability and disaster recovery capabilities via virtualization, management practices include configuration standardization, provisioning with approved build images, and using a “trust but verify” strategy for configuration compliance and change process, according to the study.
Finally, for organizations looking for greater dynamic resource management capabilities, the study recommends controls in such areas as configuration discovery, change approval and tracking, capacity and performance management, and support for greater automation.
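The “trust but verify” approach to configuration compliance mentioned above can be illustrated with a short, hypothetical sketch: changes are allowed to proceed, but each virtual machine’s running configuration is compared against an approved baseline and deviations are flagged for review. The configuration fields and VM names below are invented for illustration and do not come from the study or from any CA or VMware product:

```python
# Illustrative "trust but verify" configuration check (not a CA/VMware tool).
# Changes proceed (trust), but every VM whose running configuration drifts
# from the approved baseline is flagged for follow-up (verify).

APPROVED_BASELINE = {"vcpus": 2, "memory_mb": 4096, "tools_version": "4.0"}

def find_drift(vm_configs, baseline=APPROVED_BASELINE):
    """Return {vm_name: {setting: (approved, actual)}} for non-compliant VMs."""
    report = {}
    for name, config in vm_configs.items():
        drift = {
            key: (approved, config.get(key))
            for key, approved in baseline.items()
            if config.get(key) != approved
        }
        if drift:
            report[name] = drift
    return report

running = {
    "web01": {"vcpus": 2, "memory_mb": 4096, "tools_version": "4.0"},
    "db01":  {"vcpus": 4, "memory_mb": 4096, "tools_version": "3.5"},
}
drift_report = find_drift(running)
# db01 is flagged for its vCPU count and tools version; web01 is compliant.
```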
Elliot said IT executives understand the need for greater management capabilities and are looking to top-tier vendors such as CA, BMC Software and others for help.
“Customers are looking for confidence, for leadership in this area,” he said. “They’re engaged.”
Citrix Unveils XenDesktop 4 Virtualization Offering
Citrix is looking to tie together the disparate elements of desktop virtualization under XenDesktop 4, an offering that is designed to let businesses deliver any virtual desktop to any user through any model, including VDI and streaming. At the heart of XenDesktop 4 is Citrix’s FlexCast delivery technology, as well as the integration of Citrix’s XenApp application virtualization product.
Citrix is looking to bring together all the elements in the fractured and competitive desktop virtualization space under one umbrella.
That umbrella is XenDesktop 4.
Announced Oct. 6, XenDesktop 4 essentially comes with some 170 new features designed to enable businesses to deliver virtualized desktops to any user on any device through any model, including server-based, VDI (virtual desktop infrastructure) and streaming.
The offering supports not only Citrix’s XenServer virtualization technology, but also Microsoft’s Hyper-V and VMware’s ESX virtualization products, and VMware’s vSphere 4 platform. Company officials also said that integrating Citrix’s XenApp application virtualization technology into XenDesktop 4 makes application delivery part of the overall management scenario, reducing costs.
The new offering, which will be generally available Nov. 16 and licensed on a per-user basis, comes as VMware and Microsoft look to bulk up their desktop virtualization capabilities, and as a host of established and smaller companies—such as Wyse Technology, MokaFive, Pano Logic, Wanova and RingCube—are offering their own desktop virtualization products.
At the same time, OEMs and analysts are anticipating an uptick in spending by enterprises on new PCs toward the end of this year and into 2010, thanks to the need to refresh aging fleets of PCs and the release later this month of Microsoft’s much anticipated Windows 7 operating system.
Citrix President and CEO Mark Templeton is calling 2010 a “watershed year for desktop virtualization.”
“Twenty-five years ago, the personal computer turned the world upside down, radically improving individual productivity and communications,” Templeton said in a statement. “That world is about to change again. People today need to work in entirely new ways, powered by the connectivity of the Internet, an explosion of new devices and the limitless promise of the Web.”
All of that needs to happen without users being tied down to traditional desktops, and desktop virtualization can deliver it, he said.
The key to XenDesktop 4 is what Citrix officials call their FlexCast delivery technology, which gives businesses the ability to deliver any type of virtual desktop to any user on any device.
For example, XenDesktop 4 can offer a server-based virtual desktop for workers who share the same applications, while at the same time giving office workers who need more customization a hosted VDI option.
Blade PCs can deliver virtualized desktops to power users running high-end applications, while XenDesktop 4 can stream desktops to other users.
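Citrix has not published FlexCast’s selection logic, but the matching of delivery models to user types described in the two examples above can be sketched as a simple dispatcher. The worker profiles and model names below are illustrative stand-ins, not Citrix’s actual categories:

```python
# Illustrative mapping of worker profiles to desktop delivery models,
# loosely following the article's examples (not Citrix's FlexCast logic).

DELIVERY_MODELS = {
    "task_worker":   "server-based shared desktop",  # same apps for everyone
    "office_worker": "hosted VDI desktop",           # needs more customization
    "power_user":    "blade PC desktop",             # high-end applications
    "mobile_user":   "streamed desktop",             # runs locally, managed centrally
}

def choose_delivery_model(profile):
    """Pick a delivery model for a worker profile, defaulting to hosted VDI."""
    return DELIVERY_MODELS.get(profile, "hosted VDI desktop")

model = choose_delivery_model("power_user")
# A power user running high-end applications gets a blade PC desktop.
```

The point of the "any desktop, any user, any model" pitch is exactly this kind of per-user dispatch: one management umbrella, several delivery back ends.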
Citrix also has enhanced its HDX high-definition technology, which gives users of XenDesktop 4 a full desktop experience regardless of the device they’re using. The upgrades are aimed at multimedia content, real-time collaboration, 3D graphics and USB peripherals.
Dell is incorporating XenDesktop 4 into its larger Flexible Computing strategy, and Citrix officials talked about their expanded relationship with Microsoft. The new Citrix offering leverages enhancements in Microsoft’s Windows Server 2008 R2 Remote Desktop Services and VDI Suites. The two companies also are working to integrate Microsoft’s Application Virtualization technology into XenDesktop, and XenDesktop 4 is designed to take advantage of features in Windows 7.
VMware Fusion 3 Offers ‘Snow Leopard,’ Windows 7 Support
VMware is planning to release its updated Fusion 3 virtualization software for Apple’s Mac and Mac OS X. The latest version of VMware Fusion supports Apple’s recently released Snow Leopard and the upcoming Microsoft Windows 7, as well as graphics standards such as OpenGL 2.1 and DirectX 9.0c.
VMware is preparing to release the latest version of its Fusion virtualization software for the Mac, which now includes support for Apple’s recently released Mac OS X 10.6 “Snow Leopard” and Microsoft’s soon-to-be-released Windows 7 operating system.
On Oct. 6, VMware announced it would start selling Fusion 3 on Oct. 27. The virtualization software is already available for pre-order through VMware’s own Website, the Apple Online Store and Amazon.com for $79.99. Those users looking to upgrade from previous versions of Fusion can purchase the software for $39.99.
Right now, there are a handful of companies that offer desktop virtualization software that allows Mac users to run Windows within a separate virtualized partition. In addition to VMware, Parallels sells its own Desktop 4.0 Switch to Mac software. Parallels recently updated its offering in August.
Apple also offers its own partitioning software – Boot Camp – which is available from the Apple Online Store and Apple Website.
VMware is offering more than 50 new features with the release of Fusion 3. The most important is support for Apple’s Snow Leopard. Fusion 3 uses a new, 64-bit core virtualization engine and offers native support for the 64-bit kernel, which VMware claims offers better Windows performance on the Mac.
In addition, VMware Fusion 3 supports Windows 7, which is slated for release on Oct. 22, as well as the Windows Aero user interface and the Flip 3D navigation feature. For gamers and PC enthusiasts, the Fusion 3 software offers support for both OpenGL 2.1 and Microsoft’s DirectX 9.0c Shader Model 3 graphics, which allows users to run Windows games and other applications on the Mac.
Finally, VMware Fusion 3 offers a new migration tool that can move a snapshot of a user’s PC to a virtual machine on the Mac. The migration tool uses an installer and a four-digit code that allows the migration to happen automatically across the network. The Website Ars Technica has a comprehensive rundown on how the PC-to-Mac migration technology works.
Pano Logic Rolls Out Latest Desktop Virtualization Offering
A day after Citrix and VMware showed off new capabilities in their desktop virtualization products, Pano Logic is announcing Version 2.8 of its Pano System. Pano Logic’s desktop virtualization offering centralizes all the processing, memory and software on a central server, and leaves only a zero-client Pano Device on the employee desktop. Pano System 2.8 enhances scalability and management, and includes the ability for users to connect the Pano Device to two monitors.
Pano Logic is releasing the latest version of its desktop virtualization technology into a highly competitive market that has seen its share of vendor moves recently.
A day after Citrix Systems unveiled its XenDesktop 4 technology and VMware announced that its Fusion product now supports Apple’s “Snow Leopard” and Microsoft’s Windows 7 operating systems, Pano Logic is rolling out Pano System 2.8 and Pano Dual Monitor, which officials said will improve the end-user experience as well as the scalability and manageability of their offering.
The Pano Logic system includes a Pano Device, which officials call a zero-client endpoint. The Pano Device has no processor inside, instead putting all of the computing power on a back-end server.
The endpoint device is connected via existing IP networks to an instance of Microsoft’s Windows operating system, which is virtualized on the centralized server. The system is controlled by Pano Manager software.
Pano Logic is bucking a trend in desktop virtualization, according to Parmeet Chaddha, executive vice president of engineering at the company. It centralizes all of the key elements—from processing power and memory to the OS and drivers—while other VDI (virtual desktop infrastructure) options, such as thin clients, continue to add features, such as more processing power, to the endpoint devices.
“What you’re seeing is not thin clients, but thicker clients,” Chaddha said in an interview.
The results include bigger management headaches and greater security concerns, he said. With Pano Logic’s system, everything is centrally contained within a back-end server.
“Zero-client is as dumb as it gets,” he said. “There’s nothing IT needs to do on the endpoint. There’s no data on it. … It’s truly 100 percent centralization.”
Businesses can save 70 percent in total cost of ownership over traditional PCs, and the Pano Device uses less than 3 watts of power, about 97 percent less than a typical desktop, saving on power and cooling costs, officials said.
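The power claim above is simple arithmetic to check. As a rough sketch, the calculation below uses the quoted figure of less than 3 watts for the Pano Device against a hypothetical 100-watt desktop PC (an assumed figure, not from the article; real desktop draw varies widely):

```python
# Rough annual-energy comparison using the numbers quoted above.
# The 100 W desktop figure is an illustrative assumption.
HOURS_PER_YEAR = 24 * 365  # 8,760 hours, always-on

def annual_kwh(watts, hours=HOURS_PER_YEAR):
    """Energy used in a year, in kilowatt-hours."""
    return watts * hours / 1000.0

pano_kwh = annual_kwh(3)       # ~26 kWh/year for the zero client
desktop_kwh = annual_kwh(100)  # ~876 kWh/year for the assumed PC
saving = 1 - pano_kwh / desktop_kwh
print(f"{saving:.0%} less energy")  # prints "97% less energy"
```

Under those assumptions, the 97 percent figure the company cites falls straight out of the wattage ratio, before any cooling savings are counted.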
With Pano System 2.8, announced Oct. 7, the company is doubling the number of desktops that can be controlled from a single Pano Logic management console, to about 1,000, Chaddha said. The release also enhances the backup and restore capabilities, and adds Pano Dual Monitor, a USB adapter that enables two displays to be connected to the Pano Device, giving users multidisplay capabilities for applications and the Windows OS.
Pano System 2.8 is available immediately, starting at $319 per desktop.
10 Things You Need to Know Now About Data Deduplication
The value of data deduplication storage technology is hard to overstate. By eliminating redundant data from disk storage devices, it lowers storage capacity requirements, which in turn cuts data center power and cooling costs and reduces the carbon dioxide produced to generate the power that runs the hardware. Little wonder it is among the most requested features in new storage system purchases.
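At its core, deduplication stores each unique chunk of data once and replaces repeats with references to the stored copy. The article does not describe FalconStor's implementation; the sketch below is a generic, minimal illustration of the idea using content hashes, with made-up sample blocks:

```python
import hashlib

def deduplicate(blocks):
    """Store each unique block once; duplicates become hash references.

    `blocks` is a list of byte strings (fixed-size chunks of data).
    Returns (store, recipe): unique blocks keyed by SHA-256 digest,
    and the ordered digest list needed to rebuild the original data.
    """
    store = {}
    recipe = []
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:  # new content: keep exactly one copy
            store[digest] = block
        recipe.append(digest)    # a duplicate costs only a reference
    return store, recipe

def rebuild(store, recipe):
    """Reassemble the original byte stream from the dedupe store."""
    return b"".join(store[d] for d in recipe)

blocks = [b"AAAA", b"BBBB", b"AAAA", b"AAAA", b"CCCC"]
store, recipe = deduplicate(blocks)
print(len(blocks), "blocks in,", len(store), "blocks stored")
```

Five input blocks shrink to three stored blocks here; in backup workloads, where the same files recur night after night, the reduction ratios are far larger, which is where the power and capacity savings come from.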
Several companies have been providing deduplication for several years, and FalconStor, best known as an OEM "dedupe" partner for EMC, Sun, IBM, Acer, Pillar Data Systems, 3PAR, Isilon and several others, has been a busy producer. FalconStor's brand of dedupe works cross-platform, and users say it is fast and efficient. Still, potential buyers have much to learn about how it works and what benefits it brings. To that end, FalconStor Director of Marketing Fadi Albatal offers a list of key facts about dedupe that an IT manager should know before buying.