Wednesday 9 September 2015

Which open source cloud infrastructure tool is right for me?

Choosing the wrong open source cloud infrastructure tool can take a toll on your IT environment. So which one is right for your business?

Open source or not, the cloud infrastructure tool you ultimately select will become critical to your data center environment. And once the tool is deployed, integrated, configured and in production, switching would cause a significant disruption in cloud services.
To choose the best open source cloud infrastructure tool for your business, evaluate potential tools upfront and select the best candidate from the start. Ask whether the tool does what you need it to do. Cloud infrastructure tools are complex and open source tools, in particular, may not be well-documented, which makes it tough to perform objective comparisons.
Here's a breakdown of three common open source cloud infrastructure tools to help you determine which makes the most sense for your cloud environment.
Examining three common open source cloud infrastructure tools
1.      Apache CloudStack is a multi-tenant Java tool that supports multiple hypervisors, including XenServer, KVM, Hyper-V and vSphere. It offers APIs for software integration and a web-based interface for cloud management. Additionally, CloudStack can:
·         Manage storage instances on hypervisors.
·         Orchestrate network services such as DHCP, NAT, firewalls and VPNs.
·         Offer reporting functionality for network, compute and storage resources.
·         Provide user management capabilities.
2.      OpenNebula provides a rich feature set that organizations can use to create fully functional clouds. Capabilities include:
·         Multi-tenant and highly secure operations.
·         On-demand provisioning and monitoring of compute, storage and network resources.
·         High availability.
·         Distributed resource optimization for better workload performance.
·         Centralized management across multiple availability zones and interfaces for public clouds like Amazon Web Services.
·         Significant extensibility.
3.      OpenStack is a comprehensive operating system for cloud environments. It's composed of separate compute, storage and networking modules built on a foundation of shared services such as identity, image handling and orchestration. OpenStack also includes a dashboard interface.
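CloudStack's API, mentioned in item 1 above, is plain HTTP: each request carries a command, the caller's API key and an HMAC-SHA1 signature computed over the sorted, lowercased query string, per the CloudStack API documentation. A minimal Python sketch of that signing scheme (the command, keys and values here are placeholders, not a complete client):

```python
import base64
import hashlib
import hmac
import urllib.parse

def sign_request(params, secret_key):
    # Per the CloudStack API docs: sort parameters by key, build the
    # query string, lowercase it, then HMAC-SHA1 it with the secret key.
    query = "&".join(
        f"{k}={urllib.parse.quote(str(v), safe='')}"
        for k, v in sorted(params.items())
    )
    digest = hmac.new(
        secret_key.encode(), query.lower().encode(), hashlib.sha1
    ).digest()
    return base64.b64encode(digest).decode()

# Placeholder credentials; a real call appends the result as the
# signature= parameter on the request URL.
params = {"command": "listVirtualMachines", "apikey": "API_KEY", "response": "json"}
signature = sign_request(params, "SECRET_KEY")
```

The same pattern drives every CloudStack command, which is what makes the API straightforward to integrate with in-house tooling.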
Open source tools like CloudStack, OpenNebula, OpenStack and others are intended to convert a virtualized data center into a private or public cloud. While the features above are general, the actual feature sets for each product are extremely large, so carefully assess and compare them before deployment.
In addition, examine each tool's roadmap and future development directions. The tool you choose today must be compatible with other tools, services, APIs and systems that you plan to deploy in the future. The extensibility and versatility of cloud infrastructure middleware helps prevent complete roadblocks. However, organizations should still test and document all product updates and configuration changes before rolling them out to production.

Five challenges with open source cloud infrastructure tools

Open source cloud infrastructure platforms like OpenStack offer many benefits to an enterprise -- but first, IT must clear these five adoption hurdles.

Public and private clouds don't materialize by accident. Both require tools to automate, orchestrate and manage vast pools of compute, storage and network resources. And while vendors are eager to sell proprietary cloud infrastructure tools, the open source community has stepped in to supply an array of alternative platforms, including OpenStack, CloudStack and OpenNebula.
Open source cloud infrastructure tools are cost-effective and highly extensible, but they also pose challenges. Here are five common issues users experience with open source cloud infrastructure tools, along with strategies to avoid them.
1.   IT silos cripple deployments
With cloud computing tools and platforms, organizations can find, organize, provision, optimize and monitor data center resources with a high degree of autonomy -- a concept known as user self-service. This model contrasts with traditional IT organizations, where separate groups manually perform disparate tasks, such as server setup or network configuration.
Even though cloud infrastructure tools like OpenStack, CloudStack and OpenNebula can streamline cloud orchestration and automation, the tools themselves are far from automatic. Extensive setup, integration, workflow definition and other detailed efforts are required, and businesses may use them in different ways.
To successfully deploy an open source cloud infrastructure tool, combine the expertise of your server, storage, network and business teams. Remove all silos before starting a cloud project. And remember that cloud infrastructure deployments aren't fast or easy; they often require a major shift in corporate culture and management, and may require staffing changes.
2.   It's still early days for open source cloud tools
New technologies experience growing pains, and open source cloud infrastructure tools are no exception. As a result, these tools can present a unique set of challenges.
For example, a single OpenStack deployment might involve a number of OpenStack components, such as Nova for compute, Swift for storage, Heat for orchestration and Neutron for networking. And this doesn't even account for supplemental issues, such as high availability and access to back-end databases, which can further complicate cloud infrastructures.
Also, the open source community's agile development efforts aren't always smooth or bug-free. Features may not integrate or function properly, and there is no guarantee the community will address these issues -- at least not in the timetable you need. In addition, don't expect the polished manuals, documentation and support that normally accompany a vendor's product.
Careful evaluation and proofs of concept are crucial for open source cloud infrastructure platforms. Know the tools and their individual components before planning a cloud project.
3.   Integration is a doozy
Open source cloud infrastructure tools like OpenStack, OpenNebula and CloudStack are comprehensive platforms for enterprise cloud deployments. However, they are just one core part of the environment -- middleware -- and each platform demands customization and integration with other front-end and back-end software.
Typical integration involves clustering, monitoring, provisioning, configuring, logging and alerting. For example, cloud infrastructure middleware almost always needs access to a back-end SQL database, such as MySQL or MariaDB, to store configuration details. Clustering, meanwhile, might require integration with tools like Pacemaker or Corosync. And the list goes on.
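As a small illustration of that database dependency, cloud middleware typically stores a connection string in its configuration. The helper below assembles one in the common SQLAlchemy URL style; the service name, host and credentials are invented for the example:

```python
from urllib.parse import quote

def build_db_url(driver, user, password, host, port, database):
    # Assemble a SQLAlchemy-style connection URL of the kind cloud
    # middleware keeps in its database configuration section.
    return (
        f"{driver}://{quote(user, safe='')}:{quote(password, safe='')}"
        f"@{host}:{port}/{database}"
    )

# Hypothetical values for illustration only.
url = build_db_url("mysql+pymysql", "nova", "s3cret", "db.example.local", 3306, "nova")
```

Keeping this assembly in one place makes it easier to rotate credentials or move the database without hunting through scattered config files.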
Ultimately, deploying an open source cloud infrastructure platform and integrating different front- and back-end components requires significant knowledge of hardware and software. Changes to any element can impact the entire cloud environment. Integration is a huge project that takes a team to implement properly, and additional staffing may be needed.
4.   Security and performance 
Open source cloud infrastructure tools aren't always configured securely by default. For example, OpenStack defaults typically allow non-SSL (unencrypted) endpoints, which can open major security vulnerabilities in Keystone, the OpenStack identity and authentication service. As a result, it's crucial to add SSL encryption to Keystone instances. Also, turn off debug mode in Keystone and other modules after deployment, and change passwords from their default values. Always review and implement security best practices for all cloud and software components.
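One quick post-deployment sanity check is to sweep the service catalog for endpoints still served over plain HTTP. A minimal sketch (the endpoint URLs below are hypothetical):

```python
from urllib.parse import urlparse

def insecure_endpoints(endpoints):
    # Flag any catalog entry that is not served over HTTPS.
    return [url for url in endpoints if urlparse(url).scheme != "https"]

# Hypothetical service catalog; the admin endpoint was left on HTTP.
catalog = [
    "https://keystone.example.local:5000/v3",
    "http://keystone.example.local:35357/v3",
]
flagged = insecure_endpoints(catalog)
```

Running a check like this after every configuration change catches endpoints that quietly revert to insecure defaults.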
Likewise, cloud infrastructure deployments don't guarantee optimum performance for servers, storage, networks and software. Once the environment is secured, optimize or tune performance at the basic input/output system (BIOS), server, hypervisor and cloud component levels. Errors and alerts in management or agent log entries can provide valuable clues about where to optimize. For example, setting a server's power management or CPU to maximum performance keeps the hardware from being throttled.
5.   It's free until it isn't
Never opt for open source cloud infrastructure tools based on cost alone. The tools may be free, but the effort needed to customize, integrate, tweak and troubleshoot open source code can make the cloud project costly and time-consuming. A successful cloud project -- especially on an open source platform -- requires broad knowledge, copious testing and timely support from IT, as well as the broader open source community. Even then, open source cloud deployments can be onerous, and professional services or consultants may be needed to help move the project forward.

Tuesday 8 September 2015

How to assess the risks of cloud malware

Hosting apps in the cloud does not necessarily create a cloud malware risk. By focusing on new interfaces a cloud migration creates, you can make the cloud as secure as your data center.

IT executives and cloud planners have enough to worry about in the area of application and data security without hearing about new threats in the cloud. Is your information more at risk to malware in the cloud? Do you need new protection measures or tools? To decide, baseline your current risk, secure your cloud management interfaces, review the security of your cloud provider's architecture, and focus attention on special-risk cloud relationships and services. Through all of this, focus on dealing with the new or "incremental" risks the cloud creates, or you'll chase security issues forever.
Was your pre-cloud environment protected?
The first question, and one often overlooked in cloud malware threat assessment, is whether your pre-cloud environment was adequately protected against malware. The most effective way of addressing the security risks associated with a new technology or hosting option is to ask the question, "What is my incremental risk in the new application framework?" "Incremental risk" means risk that wasn't being faced (and accepted) before.
Most malware is introduced not into server application components but into client and user systems. Those systems have to be protected as your first line of defense, no matter where the applications are hosted. Take the time to do a complete audit of your anti-malware measures, including BYOD policies, regular virus scanning of systems, email scanning, and risk assessments of websites accessed on devices that are also used for work. There is little point to assessing cloud malware risk if you haven't controlled your client-system risk.
The second step is to understand what cloud malware actually means and does. Many botnets and other hackers' tools are hosted in the cloud today, but those don't necessarily threaten your cloud applications any more than they would threaten the same applications running in your data center, and they can't be controlled by you in any case. You should focus instead on cloud management system security, and on "crosstalk" within the cloud that could put your applications at risk.
Helpful measures to protect against cloud malware
Anyone who can access your cloud management system's (CMS) user interface can deploy something in your cloud or potentially change something already there. That means these CMS interfaces must be among the most secure in your business. Limit the number of users with access, insist that access occurs only through "clean" systems -- used for no personal purposes and with no access to standard Internet sites or email -- and record and audit all changes made to the cloud through the interfaces.
"Crosstalk" is a source of cloud malware risk that's often a concern to users. Unlike your own data center, a cloud runs applications from others, and some of those apps could be malware. To prevent this malware from infecting you within the cloud, there are three helpful measures:
1.      Run virus scans on your application images in the cloud, just as you'd do for applications running on your own servers.
2.      Make sure that your cloud provider's architecture isolates applications at the network level. A good cloud service will give your applications "private IP addresses" and map them to public addresses only where access is needed. Connections among components inside the cloud should be kept on private addresses where possible to ensure that others can't make the connections.
3.      Access your applications through a virtual private network, either an Internet VPN (IPsec, SSL) or a facility VPN offered by a service provider. This prevents others from creeping into your applications through an Internet link.
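To verify the network isolation described in measure 2, Python's standard ipaddress module can separate internal components on private addresses from anything publicly routable. A small sketch with made-up addresses:

```python
import ipaddress

def split_by_exposure(addresses):
    # Separate private (RFC 1918 and similar) addresses from
    # publicly routable ones; public entries deserve extra scrutiny.
    private, public = [], []
    for addr in addresses:
        if ipaddress.ip_address(addr).is_private:
            private.append(addr)
        else:
            public.append(addr)
    return private, public

# Hypothetical addresses pulled from a cloud inventory.
private, public = split_by_exposure(["10.0.12.7", "192.168.1.5", "8.8.8.8"])
```

Auditing the inventory this way highlights components that were mapped to public addresses without an actual need for external access.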
Your public addresses for cloud applications or components have to be subject to special monitoring and security. Most companies can detect attacks on their own data center resources because the attacks enter their own networks at some point, and traffic can be detected. Your cloud applications can be reached without entering your data center, and you may not see the traffic. Use all available statistics on your applications to assess the traffic patterns at your application access points, and watch for signs of an attack. If you see unusual activity, report it to your network operator and cloud provider, but increase the rate at which you scan your cloud apps for malware as well.
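As a simple illustration of watching those traffic patterns, compare the latest request count against the historical mean; anything several standard deviations out warrants a look. The counts and threshold below are hypothetical:

```python
from statistics import mean, stdev

def unusual(samples, latest, k=3.0):
    # Flag the latest request count if it exceeds the historical
    # mean by more than k standard deviations.
    return latest > mean(samples) + k * stdev(samples)

# Hypothetical hourly hit counts at a cloud application access point.
history = [1040, 980, 1015, 1100, 995, 1060]
normal_hour = unusual(history, 1055)   # within normal variation
spike_hour = unusual(history, 4800)    # worth investigating
```

Real deployments would use sliding windows and per-endpoint baselines, but even this crude check turns raw access statistics into an actionable alert.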
When assessing cloud malware risk, beware of partners
The final point in addressing cloud malware risk is to beware of your partners. In many cases, one of the reasons for hosting an application in the cloud is to facilitate access to the application by customers or trading partners. This access is "new," and thus may not be fully secured. It will always present a risk of malware.
Technology to secure Web portals is fairly well-known, and standard measures for application security can be applied to customer and partner portals offered through Web servers. A key element in this security is to provide a Web front-end and a back-end application server that does effective transaction editing before the data is moved from the cloud into the data center, or processed by a cloud application.
If you're using any direct application link with a customer or partner -- either a formal standard like Electronic Data Interchange (EDI) or just an informal exchange of XML or JSON structures to move business information -- you should ensure all these exchanges are auditable. EDI networks will provide audit services to allow transaction sources and times to be logged and reviewed, but bilateral connections with partners will depend on your own audit trail. Be sure that both sides collect transaction data and settle and match the data at least weekly, and daily if volumes are high. That will alert you to the possibility that an intruder is accessing what's intended to be a trusted customer and partner link in the cloud.
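The settle-and-match step can be as simple as comparing the transaction IDs each side logged and investigating anything seen by only one party. A sketch with invented IDs:

```python
def reconcile(our_log, partner_log):
    # Match transaction IDs recorded on both sides of an EDI-style
    # link; one-sided entries may indicate an intruder on the link.
    ours, theirs = set(our_log), set(partner_log)
    unexpected = sorted(theirs - ours)   # partner logged, we didn't send
    missing = sorted(ours - theirs)      # we sent, partner never logged
    return unexpected, missing

# Hypothetical daily transaction IDs from each side's audit trail.
unexpected, missing = reconcile(
    ["TX100", "TX101", "TX102"],
    ["TX100", "TX101", "TX102", "TX999"],
)
```

Run daily or weekly as the article suggests, a mismatch like the partner-only entry above is the early warning that a "trusted" link is being abused.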