As reported in an article in the New York Times, researchers conducted an experiment to find out whether wireless devices could one day be used for communication between servers in a data center.
The researchers set out to determine whether wireless links could speed up traffic between servers when the underlying cable network is overloaded. In one data center, several server racks were fitted with small directional antennas for wireless transmission and switching, mounted on top of the racks.
The links used extremely short radio waves: a frequency of 60 GHz, corresponding to a wavelength of about 5 millimeters. According to the researchers, this wireless system significantly accelerated communication between the racks, increasing speeds by 45% to 95% depending on the specific experimental conditions.
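The 5-millimeter figure follows directly from the 60 GHz carrier frequency. A quick sketch of the calculation (the function name here is ours, for illustration only):

```python
# Free-space wavelength of the 60 GHz link described above:
# wavelength = speed of light / frequency.
SPEED_OF_LIGHT = 3.0e8  # meters per second (approximate)

def wavelength_mm(frequency_hz):
    """Return the free-space wavelength in millimeters."""
    return SPEED_OF_LIGHT / frequency_hz * 1000.0

# 60 GHz gives a wavelength of about 5 mm, matching the article.
print(wavelength_mm(60e9))
```

Such millimeter waves are easily blocked by obstacles, which is one reason the racks needed clear line-of-sight antennas on top.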
It is well known that wireless communications are often not completely reliable. A connection can be interrupted, for example, by a running microwave oven or by poor reception at a particular point. So it is far from certain that most operators or data centers will accept this idea. Nevertheless, the situation inside a data center is radically different from the familiar experience of unreliable mobile phones and unstable Wi-Fi connections.
The entire environment within a data center is under strict control, all the processes taking place there are well predicted, and the equipment is serviced by staff to keep it running without interruption. In addition, the system uses directional antennas, i.e., the links between switching devices are carried over narrow beams of radio waves.
We often discuss technology trends and the changes they cause. We debate the technical complexities involved and estimate rates of market adoption. But most technical articles overlook one key factor: the driving force behind the adoption of any technological change is economics.
When looking at the future of IT, we can learn from the past and from how organizations and society invested in these technologies. In the mainframe era, only a few companies had access to computers (they were simply too expensive), and the expectation was that IT would automate business processes to make them faster and cheaper. Later came the distributed client-server model, which lowered the cost of acquiring technology and allowed solutions to be built for the speed and functionality demanded by specific departments. However, the rapid proliferation of disparate systems created a demand for integration that culminated in the emergence of ERPs.
The situation today is very different. The Internet is already part of our daily life and is ingrained in the IT business. Companies are no longer satisfied with mere cost reduction. That is “business as usual“, the duty of every self-respecting manager, and IT has already done much in this sense, for example by creating shared-services centers and consolidating data centers. IT now has the opportunity to be seen from the perspective of revenue generation, as a platform for creating new business and supporting growth strategies, not just as an operational area.
Viewing IT as a revenue generator requires a different “mind-set“: IT was always seen as supporting the business, but now it can generate new revenue and even be a source of income in its own right. The adoption of cloud computing makes it possible for IT to generate revenue and profit. We can start talking about Cloudnomics.
Cloudnomics can be understood as a new economic model for IT, in which metrics like TCO lose much of their importance and IT begins to be judged through the eyes of a business case. After all, any business exists to move forward, generate revenue and be profitable!
How does cloud computing enter this process?
Through the public cloud. IT no longer installs, configures and upgrades physical servers, and with the standardized, automated processes that characterize a cloud environment, the number of technicians dedicated to support decreases greatly. IT can focus on innovation and value creation for the company. Cost models also change with the cloud and its concept of elasticity: you pay only for the resources you consume, which can be linked directly to revenue generation; more use of IT means more revenue. It is an economic model that changes the rules of the game. It costs the same to rent one server for 1,000 hours as 1,000 servers for one hour.
A practical example of how the traditional IT model limits revenue generation: exploring new business opportunities with a short life span. In the traditional model, it is not economically justifiable to purchase a technology platform and put it into production (with a high up-front investment) only to operate it for a few months to exploit a one-off business opportunity; the numbers probably will not add up. With the cloud, it is perfectly possible. A simplistic example, but it illustrates the idea: producing an animated film, where rendering the final film demands enormous computing power, far more than the sum of all the previous months of production, and where all those computers become unnecessary once the project ends. In the cloud, computers are allocated as they are needed and released when they are not. The provider, in turn, enjoys economies of scale, operating thousands of servers shared by hundreds or thousands of customers.
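The “1,000 hours versus 1,000 servers“ equivalence can be sketched in a few lines. The hourly rate below is an invented figure for illustration, not a real provider price, and real clouds add volume discounts and minimum billing units that this sketch ignores:

```python
# Minimal sketch of cloud elasticity economics, assuming a flat
# (hypothetical) price per server-hour with no volume discounts.
HOURLY_RATE = 0.10  # assumed price per server-hour

def cloud_cost(servers, hours, rate=HOURLY_RATE):
    """Pay-per-use cost depends only on total server-hours consumed."""
    return servers * hours * rate

# Renting 1 server for 1000 hours costs the same as renting
# 1000 servers for 1 hour, but the second option finishes the
# rendering job roughly 1000 times sooner.
assert cloud_cost(1, 1000) == cloud_cost(1000, 1)
```

Under a traditional up-front purchase, the short-lived rendering project above would have to buy capacity sized for its peak; under pay-per-use, only the server-hours actually consumed are billed.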
The conceptual shift is much larger than the technological one. Enter the entrepreneurial CIO, reporting directly to the CEO, with a profile that is much less technical and far more focused on business and entrepreneurship. In fact, a good title for an MBA course would be “Entrepreneurship in IT“.
Among the structural changes to IT, we see a shift from an organization aimed at supporting other sectors of the company to a revenue-generating unit, and a transformation from an organization focused on building and supporting systems and computing power into one aimed at creating a computing platform on which new businesses can be generated. The choice of applications, and their consequent utilization, can be handed over to the users themselves.
This new IT may act as an incubator for business start-ups within the company.
Finally, “IT as a business unit“ is simple to write but hard to put into practice. It will not happen from one day to the next; it is a process that will unfold over the coming years. But it can be started today.
While in recent months access to applications has increasingly been offered in a pay-per-use model, hosted in shared data centers (what is known as cloud computing), arguments continue to fly over the issue of security as a recurring objection to wider adoption of the cloud.
Without wishing to deny that risks exist, let me propose the following hypothesis: we will have to define optimal levels of security that do not impede or complicate access to and use of software applications and services.
There is one maximum security level that guarantees with 100% confidence that there will be no risk: always keep your computer switched off.
Most studies that conclude quasi-apocalyptic risks exist come from computer security companies or service providers whose business case depends on those perceived levels of danger.
Those who seem to have the most spyware code distributed around the world are governments, the very bodies that in turn write the laws on security.
Ironically, the group of non-internet users is precisely the one that seems most preoccupied with security issues.
In most large organizations, restrictions on access to internet sites have meant that employees could not reach information critical to their work when they needed it.
Antivirus, antispam and other security services consume so many resources and slow the PC down so much that many of the staff who manage and maintain networks and equipment in organizations choose to uninstall them, relying instead on regular copies of their data and applications that can be restored immediately.
Even centers reporting a high level of security say they have detected malware without being able to tie it to specific devices on the internal network, or even to say what type of malware it is; presumably it exists, but its relation to any real security risk is virtually nonexistent.
The greatest risk of information loss lies in the overheating of the machine where it is stored. I would not be mistaken in saying that the physical security and control measures of any data center operating on the internet are far superior to those of any company that keeps its machines on its own premises.
The above statement is also valid for the control and limitation of physical access to machines.
Maintaining technology infrastructure in-house is a decision being reconsidered by companies, particularly small and medium-sized ones. Computing power, it seems, is no longer tied to a box; it has jumped into the clouds. The investment required for an in-house data center is high, and pay-per-use is very attractive. This is an important lever for the public cloud infrastructure-as-a-service (IaaS) model.
This year, growth will be even greater. Year-over-year expansion has averaged 64%, which has attracted ever more users so far.
In addition, 41% believe the infrastructure-as-a-service model will keep growing in the coming years. IaaS should reach maturity within the next two years, but it must first go through a phase of cultural change to take off. Some vendors present the cloud as a form of outsourcing. But it is not, and that way of thinking needs to be left behind.
The difference lies in customization. The cloud is a mass-market solution, the same for everyone, whereas outsourcing is tailored to each customer; that is why the cloud is priced at roughly half the value of outsourcing. Another point of attention in the industry is the offering itself, while security is still flagged by companies and analysts as a red alert for cloud adoption.
Today, we have more than 33,000 customers worldwide, including users of dedicated servers, cloud computing, e-commerce and more. Our current offering is focused on large companies and includes service level agreements that monitor downtime, response time and so on. Use of the public cloud is still low, but the trend for the next year should be a combination of private and public cloud.
We believe in the growth of hybrid clouds. Our portfolio includes mobility products, software and services, and we are increasingly following the as-a-service path. In November 2011, we launched the eNlight Cloud Computing Solution, which delivers cloud computing services around the world.
Cloud computing is based on a delivery model in which the infrastructure sits not with the client but on remote servers offering large processing capacity and data storage; these resources may be shared (in the case of smaller customers) or dedicated to a single contractor as needed.
In theory, this ‘cloud‘ eliminates the need for companies to worry about acquiring and maintaining hardware for a specific application or group of applications. Resources are acquired (hired as a service) as the need arises and paid for according to use.
This business model is still in its infancy. Experts say that over the next few months, companies will focus on three points related to the cloud: understanding how to store static files, testing services, and trying to move some applications to the cloud without having to change them.
Listed below are four questions that often arise when talking about cloud computing, along with their answers.
Does the adoption of cloud computing affect the obligations of companies regarding the preservation of data?
Companies must follow specific policies for preserving their sensitive data. This is because, especially in sectors that handle sensitive customer information, data must be stored in a form that can be used later, for instance in a judicial investigation. So, before sealing a cloud agreement with a supplier, you must ensure that the provider is able to store confidential files appropriately. All specifications relating to the protection, storage and retrieval of data should be set out in formal contracts between the parties.
Is data stored in the “cloud” more vulnerable?
When a company uses cloud computing services, it creates another point of access to its data. But that does not necessarily mean that information security is impaired.
Can the confidentiality of all data be preserved?
Yes: the provider's right to access customer information can be limited by contractual clauses. However, the provider needs a minimum level of freedom to act efficiently, and therefore extremely sensitive data should not be in the cloud but on the company's internal infrastructure.
What steps should follow the choice of a reliable cloud service provider?
Adopting the cloud computing model can be a good opportunity to structure a data-retention program. That is, after choosing a service provider, a company that has no policies for storing, querying and retrieving information should define them and map all of its information assets. This step is important for creating an inventory that indicates where and how each piece of corporate information is stored and handled.
Although the topic of cloud computing still demands a lot of debate and provokes conflicting opinions, it is already becoming reality. Every day we see the ecosystem built around cloud computing consolidate, and more and more success stories are published.
I will not quote the endless statistics and forecasts from industry analysts; those who provide such estimates broadly agree with each other on the numbers anyway.
The three layers of cloud, IaaS, PaaS and SaaS, can be viewed as a hierarchy: IaaS at the bottom, PaaS above it, and SaaS at the top. The upper layers are built upon the layers below, and the benefits obtained are directly related to the layer: the higher the layer, the greater the potential benefits. IaaS can be considered the commoditized layer, as it basically provides virtual infrastructure, abstracting the physical equipment away from users, but it offers no content. SaaS, in turn, enables a higher level of abstraction, because the user sees only the features of the software without needing to know what technology it uses, and need not even bother with version upgrades.
The most emblematic example is Force.com, which lets you create applications that extend the functionality of Salesforce. Later we will see PaaS consolidating itself, with its own technologies, separate from SaaS vendors. This will happen as the use of cloud services matures, when companies using PaaS coupled to SaaS come to realize they risk being locked in to those platforms.
But it is indisputable that we are still learning to exploit the potential of cloud computing, and we will learn much more in the coming years. The first projects have been exploratory, which is natural. What will we see this year? Typical workloads being outsourced to SaaS clouds and on-premise applications being transferred to IaaS clouds. Although limited in their impact, these moves are paving the way for full adoption of the model. In fact, cycles of technological change take several years to mature, and by 2020 cloud computing will probably be commonplace. But if that is to happen by 2020, the first steps must be taken now, in 2012. Cloud computing is a reality today and should already be on the radar of the IT managers of all firms.