According to a study by an international provider of ERP and enterprise software, companies are increasing their investment in Cloud technology for back-office applications, but growth will remain stable this year.
The study was based on responses from 700 medium and large organizations across 12 countries.
Almost half of the respondents (334) said that they do not currently use any Cloud application for the back office, and almost a third (222) said that between 1% and 25% of their back-office functions run on the Cloud. 32% of respondents expect to invest more in Cloud Computing this year, and another 32% said they will invest at the same pace as in 2010.
The growth potential in the enterprise space is significant. Only six respondents, from companies of different sizes, geographical locations and sectors, stated that their back-office functions are 100% based on the Cloud.
This clearly exposes the need to present a clear and unified concept, laying out the different alternatives so that potential customers of this model can be identified easily.
Although 123 public sector organizations mentioned that they currently make little use of the cloud model, and although relatively few have planned investment for 2011, 25% believe that this model will represent between 25% and 50% of their back-office applications within ten years, and another 17% estimate it will be between 50% and 75%. The main perceived advantage of Cloud technology, according to respondents in the public sector, is its ease of maintenance.
According to the study, 24% of respondents are already using a Cloud application for accounting, ahead of any other process, while little has been invested in other back-office applications, including supply chain management and production. Surprisingly, only 8% of respondents have invested in Cloud-based CRM technology.
Only 9% expect to keep 100% of their applications on premises, while 8% expect to be completely based on the Cloud. It seems that over the next 10 years a hybrid approach will become the most popular model: 83% of respondents said they expect to have a technology mix that combines on-premises applications and Cloud services.
Call it whatever you want: Cloud Computing, SaaS, PaaS or IaaS. Software that is distributed via the Internet, is easily scalable and is sold by subscription, which we refer to in this article as “Cloud Computing”, is today a fundamental reality in the world of enterprise software (and in the software world in general).
The approach starts from the question:
“What kind of ecosystem of manufacturers, service providers and customers is emerging as Cloud Computing takes hold?”
On the one hand, from the viewpoint of customers and cloud providers, the benefits of Software as a Service and Cloud Computing are clear. On the other, service providers (consulting firms, integrators, etc.) are trying to adapt to the changing needs of a market that has shifted and to differentiate themselves from the competition. Right now there is still a perfect opportunity to position themselves.
For the customer, the determining factor is that you contract a service rather than purchase software. The manufacturer must keep the client happy if it wants them to keep paying the monthly fee. In my view, this implies a change in the commercial and social relationship between the two.
For the manufacturer, processes are simplified, with the corresponding economic benefits this entails. But, especially at this early stage, it will have to overcome significant challenges such as security and standards.
For the service provider, that is, the consulting company that creates customized solutions, Cloud Computing represents both a threat (in cases where the product does not require much customization and can be deployed as SaaS, such as the Salesforce CRM) and an opportunity.
In conclusion: in my opinion, Cloud Computing will not radically change the type of players in the enterprise software sector. We will continue to have customers interested in obtaining the best solutions tailored to their needs at a reasonable price, manufacturers competing to offer the most competitive products, and integrators seeking to offer their customers the most appropriate software and put it to use.
I think the question is: as this shift towards ubiquitous Cloud Computing unfolds, who will take advantage of it and thus be better positioned in the emerging ecosystem, and who will miss the train.
It seems that this year’s buzzword in the IT sector is cloud computing, or cloud services. We at ESDS Software Solution Pvt. Ltd. are working on cataloguing the various cloud services available in India and worldwide; when we are ready, we will announce it here. Until then, I will leave a few pointers that can help everyone better understand the state of development of this technology, drawn from several articles written recently on this blog.
Platform as a Service (PaaS) in Cloud Services: the main idea of a platform as a service is the set of “layers” it gives the developer when building an application on a third-party platform: it not only solves the problem of hardware infrastructure – machines, bandwidth, scalability, availability – but also provides several layers of software infrastructure.
Infrastructure as a Service (IaaS) in Cloud Services: the basic idea is outsourcing servers, disk space, databases and/or computing time, instead of having complete control of them in the company’s own data center, or opting for a hosted data center and only managing it.
eCloud, middleware in Cloud Services: eCloud is not just a product for creating “new middleware in the cloud”; the idea is to provide options for “private clouds” (clouds that provide resources from within the corporate firewall, optimizing their use depending on demand).
The problems of Cloud Computing: with several large software companies offering products or projects based on cloud software, it is time to take a look at the uncertainties, risks and shadows that accompany this “new paradigm”: provider lock-in, control, return on investment and centralization, among others.
On May 31st, we arranged a Working Breakfast on the subject of technology, and more specifically on Cloud Services and SaaS (Software as a Service).
The report, generated from the conclusions reached in the debate among leading professionals of the sector, reveals the rapid evolution of what until recently was a trend but has now become a must-have for most companies and agents in the ICT world.
Many see the ASP model as the precursor of what is now known as the Cloud, or cloud applications. Two key factors that have contributed to better and more widely accepted adoption are the evolution of networking and connectivity, as well as more effective development of the software management layer.
One of the strengths of Cloud Computing is scalability, and NoSQL databases are an important ally here. In most current projects, the limitation of their MySQL databases is not size, but rather poor design.
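As a rough illustration of the horizontal scalability that NoSQL stores are built around (the node names are hypothetical and the modulo hashing is a simplification of the consistent hashing real systems use), the sketch below shows how keys can be spread across several nodes, so that capacity grows by adding machines rather than by scaling a single server up:

```python
import hashlib

# Hypothetical pool of storage nodes; a real cluster would discover these dynamically.
NODES = ["node-a", "node-b", "node-c"]

def node_for_key(key: str, nodes=NODES) -> str:
    """Pick the node responsible for a key via simple modulo hashing.

    Real NoSQL systems typically use consistent hashing so that adding a node
    only remaps a fraction of the keys, but the principle is the same: each
    key deterministically maps to one node, spreading data and load across
    the cluster.
    """
    digest = int(hashlib.md5(key.encode("utf-8")).hexdigest(), 16)
    return nodes[digest % len(nodes)]

if __name__ == "__main__":
    for key in ("customer:1001", "invoice:2011-04-17", "order:42"):
        print(key, "->", node_for_key(key))
```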
Experts say the technology is more than ripe for the development of cloud and SaaS in India, and that the main barriers lie in legal inequities, as when competing with foreign companies that seek and have customers in India.
Software companies focusing on the creation of online applications and cloud-based products are adopting a new development model that is more efficient and flexible, with teams capable of working with an agility and efficiency that large development companies or consultancies cannot match.
Software as a Service
The new generation of smartphones has greatly contributed to the development of online applications focused on simplicity and usability. Likewise, the emergence of app stores has helped bring the available supply of apps to SMEs, giving some momentum to this emerging sector in India.
Improved interfaces for the developer and the creation of environments such as Amazon or Google Apps have made entry into the online application market more accessible, and barriers have been removed so that even a young developer can successfully launch an app.
One piece of unfinished business for SaaS is integration with the other management tools a company already has. SAP offers great advantages when integrated with all of those tools, and the move to SaaS often means losing this advantage, resulting in reluctance among companies that already have a management platform in place.
IT departments of medium and large companies are lagging behind because very agile external solutions exist. Where an internal department may deliver a job in months, an external one can do so in weeks.
The market is growing both nationally and internationally.
A very interesting standardization of IaaS is beginning to spread, but in terms of PaaS there is still a long way to go. In fact, building applications on Azure or Google Apps can become a trap when you want to move to another platform. OpenStack is the IaaS basis on which to create a more open PaaS.
Microsoft is deploying its own public cloud, and Yahoo is also seeking its place in the cloud. But the factor that will trigger an explosion of demand will undoubtedly be interoperability between clouds.
The investment needed to upgrade to or set up a good cloud infrastructure is high, but in the first quarter of 2011 there was a broad movement among hosting providers, and service companies are already offering this option to their customers, often at the customers’ own demand.
In general, it is anticipated that all operators will end up offering their own Cloud platform, but, given the high maintenance costs, only those that achieve a significant critical mass of customers will survive.
But the Cloud revolution is not a path that can be walked alone, and ESDS Software Solution Pvt. Ltd. India, after cannibalizing some of its own solutions, is emerging with new technology for a better future in Cloud Services.
A few days ago, a get-together was arranged at our company which touched on the themes of hosting, ISPs and Cloud Computing. Below we present the conclusions of this event, in which representatives of the hosting market, data centers and cloud computing took part.
This concept gained force in 2008 in the English-speaking world. Its implementation is greatest in the U.S., and the largest investments are occurring in Asia, where these solutions can be adopted directly as businesses come online. In Europe there is a delay and investment is more modest. In part, this may be due to distrust about whether real demand exists to match the somewhat inflated expectations that have been created.
Some sectors are still wary; on the other hand, there is a real opportunity for the Cloud in India, because companies are making the necessary investment in standardization.
Hosting And Data Center Trends In India
While some see Green Hosting as a fad, other voices say it is a necessity dictated by the efficiency associated with new technologies and the growing demand for storage in business. Today, the most popular way to measure this efficiency is Power Usage Effectiveness (PUE), the ratio of the total energy a facility draws to the energy consumed by its IT equipment. And while equipment manufacturers are building ever more demanding hardware that consumes more power, it is also true that these devices allow consolidation and cooling solutions that improve efficiency.
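As a quick worked example (the wattage figures are made up purely for illustration), PUE is simply total facility power divided by IT equipment power, so a value closer to 1.0 means less overhead going to cooling, lighting and power distribution:

```python
def power_usage_effectiveness(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total facility power / IT equipment power (ideal value: 1.0)."""
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_kw / it_equipment_kw

# Illustrative numbers only: a data center drawing 1,800 kW in total
# while its servers, storage and network gear consume 1,200 kW.
print(round(power_usage_effectiveness(1800, 1200), 2))  # -> 1.5
```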
Although company CTOs are uneasy about outsourcing for fear of losing a certain amount of control or power over their area, the crisis and the need to save on staffing are driving the change. The enterprise solution based on the Plesk panel is simple and works very well for the end customer. For more complex solutions, hosting providers have to offer a higher level of administration.
Hosting Market Situation in India
Hosting providers recognize that much of the hosting contracted in India is provided by local businesses or professionals who are not able to give good service. One criterion for recognizing a good hosting deal is the quality of customer service, although it is difficult to agree on the minimum requirements for considering a hosting provider as such.
The hosting market in India is growing, albeit at a rather low rate (between 5% and 10% annually), but the coming expansion of the internet and the e-commerce associated with it is already a major source of business for hosting.
A key growth driver for a new hosting provider is post-sale service and support. During the Business Breakfast, we discussed several cases and examples that reflect the value customers place on good service over moderate differences in price.
This year the process of consolidation among hosting companies in India is predicted to continue, with new acquisitions. This will gradually reduce the number of “players” in the market while those that remain grow stronger.
Security & Trust
The hot topic right now in India, as far as hosting security is concerned, is the interplay between Cloud Computing and the Data Protection Act. The possibility that data may sit on foreign servers creates anxiety and mistrust.
In terms of technical security, almost all hosting providers agree that DDoS attacks are less of a factor than is often assumed, and that over 90% of intrusions are due to irresponsible use by the customer.
They are also unanimous in stating that, against attacks on net neutrality by big telcos or governments, the network will respond with new resources such as encrypted connections or changes in access provider.
CTOs show some misgivings about the so-called “digital natives”, since most stay at the surface of the technology they use. The advantage is that they have a high degree of adaptability and learn quickly. Therefore, if the IT department has always documented everything, it will avoid a long and troublesome learning process.
Salaries of IT professionals are forecast to rise.
It is relatively common for startups to request free servers in exchange for the publicity benefit that might come when the company reaches success. But this pays off very rarely. In general, startups need to tighten their margins to the extreme, and do not typically demand as high a volume of servers as other customers.
Finally, we discussed upcoming trends: Virtual Desktop Infrastructure (VDI), audiovisual content, and DropBox-style cloud application products, some of the applications being moved to the cloud to compete with applications that until now had no competition.
VMware, a player in cloud virtualization, has just released, still in beta, its new platform as a service (PaaS), Cloud Foundry.
The current market leaders in PaaS solutions are Microsoft Azure and Google App Engine. The novelty of Cloud Foundry is that it is presented as open source and can support multiple frameworks, Cloud providers and application services.
Its usefulness is that it shortens the time needed to design an application, build the code, and finally move it to the cloud, using an open PaaS solution.
There are three types of services offered in Cloud Computing:
Infrastructure as a Service (IaaS), which offers storage, CPU, memory and network resources so we can install the operating system of our choice and deploy whatever framework we want.
Platform as a Service (PaaS), which provides a ready operating system and an environment in which to develop, as with Microsoft Azure and Cloud Foundry.
Software as a Service (SaaS), which provides software directly, without having to worry about the platform it is developed on or the infrastructure it runs on, as when you use Google Docs.
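As a rough summary of the split between provider and customer at each of these levels (the wording is mine, not taken from any provider's documentation), this small snippet simply lays out who manages what:

```python
# Illustrative mapping of responsibilities at each cloud service level.
SERVICE_MODELS = {
    "IaaS": {
        "provider_manages": ["hardware", "virtualization", "network", "storage"],
        "customer_manages": ["operating system", "runtime/framework", "application", "data"],
    },
    "PaaS": {
        "provider_manages": ["hardware", "virtualization", "network", "storage",
                             "operating system", "runtime/framework"],
        "customer_manages": ["application", "data"],
    },
    "SaaS": {
        "provider_manages": ["everything up to and including the application"],
        "customer_manages": ["their own data and configuration"],
    },
}

for model, split in SERVICE_MODELS.items():
    print(f"{model}: provider -> {', '.join(split['provider_manages'])}")
    print(f"{' ' * len(model)}  customer -> {', '.join(split['customer_manages'])}")
```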
The tools it works with include the SpringSource platform (acquired by VMware in 2009) for Java developers, Rails and Sinatra for Ruby developers, as well as Node.js and other JVM frameworks, including Grails.
Cloud Foundry has a good degree of portability, allowing developers to migrate applications across environments, cloud providers and enterprise data centers without disrupting or modifying the application.
The platform is not tied to any particular environment and supports private or public clouds, including those deployed on VMware vSphere and on the vCloud developed by VMware, as well as other public clouds.
In 1983 Apple introduced a great innovation, the Lisa, which used a fairly elaborate graphical interface and came with a suite of office applications. The interface worked well, the applications ran with surprising performance, and the configuration was far superior to the PCs of the time. The problem was the price: $10,000 at the time.
Although there was nothing better on the market, the Lisa did not achieve the expected success. In total, about 100,000 units were produced in two years, but most of them were sold at deep discounts, often below cost. Since Apple had invested approximately $150 million in the development of the Lisa, the balance ended up in the red.
Nevertheless, the development of the Lisa was the basis for the Macintosh, a simpler computer released in 1984. Unlike the Lisa, it was a great success, even threatening the PC empire, since its configuration was comparable to the PCs of the era, with an 8 MHz processor, 128 KB of memory and a 9-inch monitor. Its big gun was MacOS 1.0 (derived from the Lisa operating system, but optimized to consume much less memory), an innovative system from various points of view.
Unlike MS-DOS, it was entirely based on the use of a graphical interface and mouse, which made it much easier to operate. MacOS continued to evolve and incorporate new features, but always kept the same idea of a user-friendly interface.
During this same period, Microsoft developed the first version of Windows, which was announced in November 1983. Unlike MacOS, Windows 1.0 was a rather primitive interface and had little success. It ran on top of MS-DOS and could run Windows applications as well as MS-DOS programs. The problem was memory.
The PCs of the era came with very small amounts of RAM, and at that time it was still not possible to use virtual memory (which would only be supported from the 386 onwards). To run Windows, it was first necessary to load MS-DOS. The two together consumed virtually all the memory of an entry-level PC of the time. Even the beefier PCs could not run many applications at once, again for lack of memory.
As Windows applications were very rare at the time, few users saw the need to use Windows to run the same applications that were already running (with more available memory) in MS-DOS. Not to mention that the initial release of Windows was very slow and had too many bugs.
Windows began to find some success with version 2.1, when computers with a 286 processor and larger amounts of memory became common. With a more powerful configuration, more memory and more applications, it finally began to make sense to run Windows. The system still had many problems and crashed often, but some users began to migrate to it. Windows 2.1 also gained an edition for PCs with 386 processors, with support for memory protection and use of virtual 8086 mode to run MS-DOS applications.
Windows really caught on from version 3.11 (for Workgroups), which was also the first version of Windows to support network shares. It was relatively light for the PCs of the time (386s with 4 or 8 MB of RAM were common) and supported the use of swap memory, allowing multiple applications to stay open even if RAM ran out. By then the application base for Windows was already much larger (the most used titles were Word and Excel, not much different from what we have on many desktops today), and there was always the option of returning to MS-DOS when something went wrong.
At this point PCs began to regain the ground lost to Apple’s Macintosh. Although it was unstable, Windows 3.11 had a large number of applications, and PCs were cheaper than Macs.
Strange as it may seem, Windows 3.11 was still a 16-bit system that operated more like a GUI for MS-DOS than as an operating system in itself. When you turned on the PC, MS-DOS was loaded and you had to type “win” at the prompt to load Windows. Of course, you could set your PC to boot into Windows by default by adding the command to autoexec.bat, but many (possibly most) preferred to launch it manually, so they could fix things from DOS if Windows locked up while loading.
In August 1995 Microsoft launched Windows 95, which marked the platform’s transition to 32 bits. Besides all the interface improvements and the new API, Windows 95 brought two important changes: support for preemptive multitasking and memory protection.
Windows 3.11 used a primitive type of multitasking called cooperative multitasking. In it, no memory protection was used, there was no real multitasking, and all applications had full access to all system resources. The idea was that each application used the processor for a while, then handed it off to another program and waited for its turn to come around again to run another handful of operations.
This anarchic system left room for all sorts of problems, since applications could monopolize processor resources and end up halting the system completely if something went wrong. Another big problem was that, without memory protection, applications could invade memory areas occupied by others, causing a GPF (General Protection Fault), the error behind the famous Windows blue screen.
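As a loose illustration of why cooperative multitasking is fragile (a toy sketch in Python using generators, nothing like the actual Windows 3.11 internals), each task runs only until it voluntarily yields; a task that never yields starves every other task, just as a misbehaving Windows 3.x application could freeze the whole system:

```python
def well_behaved(name, steps):
    """A cooperative task: does a little work, then yields control back."""
    for i in range(steps):
        print(f"{name}: step {i}")
        yield  # voluntarily hand the processor back to the scheduler

def greedy(name):
    """A misbehaving task that never yields -- it would hang the scheduler."""
    while True:
        pass  # monopolizes the CPU; in Windows 3.x this froze everything

def cooperative_scheduler(tasks):
    """Round-robin over tasks, trusting each one to yield promptly."""
    while tasks:
        task = tasks.pop(0)
        try:
            next(task)          # run the task until its next yield
            tasks.append(task)  # put it back at the end of the queue
        except StopIteration:
            pass                # task finished

# Two polite tasks interleave nicely...
cooperative_scheduler([well_behaved("A", 3), well_behaved("B", 3)])
# ...but adding greedy("C") to the list would hang the loop forever,
# since the scheduler has no way to preempt it.
```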
Memory protection isolates the memory area used by each application, preventing it from accessing the memory areas used by others. Beyond the question of stability, this is also a basic security function, since it prevents malicious applications from having easy access to data handled by other applications.
Although the hardware had supported it since the 386, memory protection only came into use with Windows 95, which also introduced the already mentioned preemptive multitasking, isolating the memory areas occupied by applications and managing the use of system resources.
The big problem was that, although it was a 32-bit system, Windows 95 still had many 16-bit components and maintained compatibility with the 16-bit applications of Windows 3.11. As long as only 32-bit applications were used, things worked fairly well, but when a 16-bit application was run, the system fell back to cooperative multitasking, with the same old problems.
At the time, Windows faced competition from IBM’s OS/2, which shone as a more robust option, especially for the business audience.
OS/2 arose from a partnership between IBM and Microsoft to develop a new operating system for the PC platform, capable of succeeding MS-DOS (and the early versions of Windows, which ran on top of it). Initially, OS/2 was developed to take advantage of features introduced by the 286 (new instructions, support for 16 MB of memory, etc.), but it was quickly rewritten as a 32-bit operating system, able to exploit all the features of the 386 onwards.
However, the success of Windows 3.x made Microsoft change its mind and see Windows (not OS/2) as the future of the PC platform. Tensions kept increasing until the partnership was broken in 1991, leading Microsoft and IBM to become competitors. IBM continued to invest in the development of OS/2, while Microsoft used the code it had developed to start work on Windows NT, which was developed in parallel with Windows 95.
Although OS/2 was technically superior to Windows 95, it was Microsoft that kept gaining ground, because Windows 95 was easier to use and benefited from users’ familiarity with Windows 3.11, while IBM stumbled over a combination of lack of investment, lack of developer support and lack of marketing.
Thanks to the terms of the earlier agreement, IBM was able to include support for Windows applications in OS/2. However, the strategy backfired: it discouraged further development of native OS/2 applications, so OS/2 ended up competing with Windows on its own territory. Running Windows applications inside OS/2 was more problematic and performance was lower, leading more and more users to prefer using Windows directly.
Although officially dead, OS/2 is still used by some companies and groups of enthusiasts. In 2005 Serenity bought the rights to the system, giving rise to eComStation.
A much more successful system that began development in the early 90s is one we all already know: Linux. Linux has the advantage of being an open system, which today enjoys the support of thousands of volunteers and developers around the globe and the backing of heavyweight companies such as IBM, HP, Oracle and virtually every other major tech company with the exception of Microsoft and Apple.
Nevertheless, in the beginning the system was much more complicated than current distributions and did not have the rich graphical interfaces we have today. While Linux has been strong on dedicated servers since the late 90s, use of the system on desktops is growing at a slower pace.
Back to the history of Windows: with the release of NT, Microsoft had to maintain two separate development branches, one with Windows 95 for the consumer market and the other with Windows NT for the corporate audience.
The tree that originated with Windows 95 led to Windows 98, 98 SE and ME, which, despite progress in the API and applications, retained the fundamental stability problems of Windows 95. Another major problem was that these versions of Windows were developed without a security model, in an era in which files were exchanged via diskettes and machines were accessible only locally. This kept the system light and gave it good performance on the machines of the time, but became a heavy burden as access to the Web grew popular and the system began to be targeted by all sorts of attacks.
The Windows NT tree, on the other hand, had much more success in creating a modern operating system that was far more stable and had a more solid foundation from the point of view of security. Windows NT 4 gave rise to Windows 2000, which in turn was used as the foundation for Windows XP and Windows Server 2003, kicking off the current line.
Apple, in turn, has undergone two major revolutions. The first was the migration from the old MacOS to OS X, which, beneath its polished interface, is a Unix system derived from BSD. The second happened in 2005, when Apple announced the migration of its entire line of desktops and notebooks to Intel processors, which allowed it to benefit from the cost reductions of PC processors and components while retaining the platform’s main differentiator, which is the software.
From the hardware standpoint, Macs today are not much different from PCs; you can even run Windows and Linux via Boot Camp. However, only Macs are capable of running Mac OS X, due to the use of EFI (which replaces the BIOS as the boot firmware) and the use of a TPM chip.
Despite the change, Macs remain a closed platform controlled by Apple, which develops both the computers and the operating system. Of course much is outsourced, and many companies develop software and accessories, but since Apple has to keep track of everything and develop a great deal on its own, the cost of a Mac ends up being higher than that of a PC.
In the early 80s competition was fierce, and many felt that the Apple model would prevail, but that’s not what happened. The history of computing offers many stories showing that open standards almost always prevail. An environment with several companies competing against one another promotes the development of better products, which creates greater demand and, through economies of scale, allows lower prices.
Because PCs have an open architecture, many different manufacturers can participate and develop their own components based on standards already set. There is a huge list of components compatible with one another, which allows us to choose the best options among various brands and models.
Any new manufacturer with a cheaper motherboard or a faster processor, for example, can enter the market; it’s just a matter of creating the necessary demand. Competition forces manufacturers to work with relatively low profit margins, winning on sales volume instead, which is very good for us, the buyers.