Categories
News

Talk & moderation: Public & Private Cloud – Where is the journey heading?

On March 5, 2013, René Büst gave a talk at CeBIT Webciety 2013 on the topic "Public & Private Cloud – Wohin geht die Reise?" ("Where is the journey heading?") and afterwards moderated the discussion panel, whose participants included Holger Dyroff (ownCloud), Kurt Rindle (IBM), and Ali Jelveh (Protonet).

The slides from his talk can be viewed here:

CeBIT Webciety 2013 – Private & Public Cloud: Wohin geht die Reise? from Rene Buest

The recording of the talk and the panel discussion is available here:

http://webcast.nc3-cdn.com/clients/CeBIT/2013/03/05/?skipto=6380

Categories
Analysis

Top Trend: Web-based GUIs for Cloud IaaS

Many administrators are not developers and do not necessarily know how to use the APIs of current infrastructure-as-a-service (IaaS) providers to build a complex cloud infrastructure programmatically. The next generation of cloud services, especially in the IaaS space, must and will be much easier to use. In the future we will see more and more cloud offerings that provide, in addition to an API, a graphical web interface that lets users "click together" cloud infrastructures without any knowledge of the underlying API.

Web-based GUIs for cloud infrastructures

The pioneer is the German company openQRM Enterprise. Since 2009, alongside its open-source cloud infrastructure solution openQRM for building one's own public and private clouds, it has also shipped a web-based GUI called the "Visual Infrastructure Designer".

openQRM Visual Infrastructure Designer

ProfitBricks, also from Germany, has offered its infrastructure-as-a-service solution since May 2012. ProfitBricks is likewise the first public cloud provider on the market with a web-based GUI for building one's own cloud infrastructure, the "Data Center Designer".

ProfitBricks Data Center Designer

Both web GUIs, the "Visual Infrastructure Designer" and the "Data Center Designer", help cloud administrators set up and manage everything from ordinary to complex cloud infrastructures.

Open-source cloud solutions such as Eucalyptus, OpenStack, or CloudStack, as well as public IaaS providers such as Amazon Web Services, Windows Azure, and Rackspace, do not offer such interfaces, so administrators there are entirely dependent on the APIs.
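To make concrete what "dependent on the APIs" means in practice, here is a minimal Python sketch of launching instances on Amazon EC2 programmatically. The parameter names match the EC2 RunInstances API; the AMI id and instance type are placeholders, and the actual call through AWS's Python SDK is only indicated in a comment.

```python
# Illustrative sketch: building infrastructure through an IaaS API instead of
# a graphical designer. Parameter names follow the EC2 RunInstances API;
# "ami-12345678" and "m1.small" are placeholder values.

def run_instances_params(ami_id, instance_type, count=1):
    """Assemble the request parameters for launching EC2 instances."""
    return {
        "ImageId": ami_id,           # which machine image to boot
        "InstanceType": instance_type,
        "MinCount": count,           # EC2 requires explicit min/max counts
        "MaxCount": count,
    }

params = run_instances_params("ami-12345678", "m1.small", count=2)
# With AWS's Python SDK, this dict would then be passed to the
# RunInstances call to actually start the machines.
```

Even this trivial example shows why a "click together" web interface lowers the barrier: the administrator would otherwise have to know the API's parameter names and semantics for every resource type.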

A top trend with room to grow

The "Visual Infrastructure Designer" from openQRM and the "Data Center Designer" from ProfitBricks do not yet exhaust their full potential, but they show very well in which direction the design and management of cloud infrastructures will develop. Alongside an extensive and well-documented API, web-based graphical management interfaces and infrastructure designer tools are among the top trends in infrastructure-as-a-service and will belong in the portfolio of every IaaS cloud provider in the future.


Categories
Comment

One third of German companies use the cloud. Really? I don't think so.

According to a Bitkom survey of 436 German companies, a third of all respondents used cloud computing in 2012. At first this sounds good and shows that cloud adoption in Germany is on the rise. However, I assume the number is sugarcoated. No, not by Bitkom itself, but because it is still unclear what cloud computing really means, and most of the surveyed companies simply answered yes even though they are not actually using cloud. My assumption is supported by Forrester Research.

Bitkom's survey results

That one in three companies in Germany now relies on the cloud amounts to growth of roughly 9 percent compared to 2011. In addition, 29 percent plan to deploy cloud solutions, while another third does not see cloud computing on the agenda at all. The survey reveals that 65 percent of large firms with 2,000 or more employees currently have cloud solutions in use. Mid-sized companies with 100 to 1,999 employees come in at 45 percent, and smaller companies with 20 to 99 employees at a quarter.

Private cloud is preferred

Moreover, 34 percent of the surveyed companies rely on their own private clouds, a growth of 7 percent compared to 2011. Another 29 percent plan to use this form of cloud.

Now to my assertion that the claim that one third of German companies use the cloud is sugarcoated. What I hear and see again and again has now also been stated publicly by Forrester Research, more precisely by James Staten, who even calls it cloud-washing: 70 percent of "private clouds" are not clouds at all.

70 percent of "private clouds" are not clouds

The problem lies mainly in the fact that most IT administrators still lack an understanding of what cloud computing, whether public or private, really means. As James Staten writes, 70 percent of the IT administrators interviewed are not aware of what a private cloud really is. Most already call a fully virtualized environment a cloud, even though such an environment generally lacks the core features of a cloud.

Virtualization is not cloud computing

It has to be made clear once again at this point that merely virtualizing an infrastructure does not make it a private cloud. Virtualization is a subset of cloud computing and one of its key components. But self-service, scalability, resource pools, automation, granular billing, on-demand delivery of resources, and so on are things no ordinary virtualization solution offers; only a cloud infrastructure provides them.
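The distinction can be made concrete with a small, purely illustrative sketch; the feature names and the `is_cloud` helper are hypothetical, not part of any product:

```python
# Hypothetical checklist: an environment only counts as a cloud if it offers
# the core features beyond mere virtualization. Feature names are illustrative.

CLOUD_CORE_FEATURES = {
    "self_service",
    "scalability",
    "resource_pools",
    "automation",
    "granular_billing",
    "on_demand_delivery",
}

def is_cloud(features):
    """True only if every core cloud feature is present."""
    return CLOUD_CORE_FEATURES <= set(features)

# A merely virtualized environment fails the test ...
assert not is_cloud({"virtualization", "resource_pools"})
# ... while a full cloud infrastructure passes.
assert is_cloud(CLOUD_CORE_FEATURES | {"virtualization"})
```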

Frighteningly, some vendors are so brazen as to now sell their former on-premise virtualization solutions as a cloud. I received this "confession" from an employee of a very large U.S. vendor that now also offers cloud solutions. In a personal conversation, the gist was: "We simply wrote cloud on our adapted VMware solutions to quickly have something 'cloud-ready' on the market."

German companies believe they have a "private cloud"

I see it similarly with German companies. I would not blame Bitkom; after all, they have to rely on the questions being answered correctly. And what can they do if respondents, out of ignorance, answer incorrectly by claiming to use a private cloud, even though it is no more than a virtualized infrastructure without cloud properties?

With this in mind, you should view the results of this Bitkom survey critically, put them into perspective, and acknowledge that it is not one third of German companies that use cloud computing.

Update: 12.03.13

I do not want to give the impression that I am pulling my claims out of thin air. Yesterday somebody told me that their "terminal server" is a private cloud. The reasoning: there are so many definitions of cloud that you can pick whichever suits you.

Update: 13.03.13

Exchange servers with OWA are also often called a "private mail cloud".


Categories
Comment

Big data and cloud computing help more than just Obama

In my guest post on the automation experts blog two weeks ago, I discussed big data and what companies should learn from Barack Obama's 2012 U.S. election campaign about converting real-time information into a lead. Alongside cloud computing, mobile, and social media, big data is one of the current top issues in enterprise IT. It is far more than a trend; it is a reality, with far-reaching influence on businesses, their strategic direction, and their IT. Established technologies and methods for analyzing big data have reached their limits, and in the future only the company that manages to gain an information advantage from its data silos will be a step ahead of the competition.

Big Data: No new wine in old wineskins

Basically, the idea behind big data is nothing new. From the early to mid 1990s, the term "business intelligence" already described procedures for systematically analyzing data. The results are used to gain new insights that help a company achieve its objectives and make strategic decisions. However, the data sets to be analyzed back then were much smaller than today, and only data from the past could be analyzed, leading to uncertain forecasts for the future. Today, everyday objects collect massive amounts of information every second: smartphones, tablets, cars, electricity meters, and cameras. There are also sources not in the immediate vicinity of a person, such as fully automated production lines, distribution warehouses, measuring instruments, aircraft, and other means of transport. And of course it is we humans who feed big data with our habits. Tweets on Twitter, comments on Facebook, Google search queries, browsing on Amazon, and even vital signs recorded during a jogging session provide modern companies with vast amounts of data that can be turned into valuable information.

Structured and unstructured data

Large data sets are not a new phenomenon. For decades, retail chains, oil companies, insurance companies, and banks have collected substantial information on inventories, drilling data, and transactions. Projects for the parallel processing of large data sets, data mining grids, distributed file systems, and distributed databases also belong to the typical areas of what is now known as big data, as do the biotech sector, interdisciplinary scientific research projects, weather forecasting, and the medical industry. All of these areas and industries struggle with the management and processing of large data volumes.

But now the problem has reached the "normal" industries as well. Today's challenge is that data arises from many different sources, sometimes quickly, unpredictably, and partly unstructured. Big data is meant to help wherever many different data sources can be combined, for example tweets on Twitter, browsing behavior, or information about clearance sales, in order to develop new products and services from that understanding. New regulations in the financial sector lead to higher data volumes and require better analysis. In addition, web portals like Google, Yahoo, and Facebook collect an enormous amount of data daily, which they also associate with users to understand how users move around the site and behave. Big data is becoming a general problem. According to Gartner, enterprise data could grow by up to 650% over the next five years. 80% of that will be unstructured data, or big data, which has already proven difficult to manage. IDC also estimates that the average company will have to manage 50 times more information by 2020, while the number of IT staff will grow by only 1.5%. A challenge companies must respond to efficiently if they want to remain competitive.
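A quick back-of-the-envelope calculation shows what the IDC figures cited above imply for the load per administrator:

```python
# Back-of-the-envelope check of the IDC figures: 50x more data to manage
# by 2020, while IT staff grows by only 1.5%.

data_factor = 50.0     # information volume multiplier by 2020 (IDC)
staff_factor = 1.015   # head count multiplier (+1.5%)

data_per_admin = data_factor / staff_factor
assert round(data_per_admin, 1) == 49.3  # ~49x more data per administrator
```

In other words, the staff growth barely dents the per-person load, which is exactly why automation and scalable infrastructure become unavoidable.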

Why companies choose big data

But where do these huge amounts of data come from, and what motivates a business to deal with the issue? Market researchers at the Experton Group tried to answer these questions in their "Big Data 2012 – 2015" client study in October 2012. According to the study, the main driver for the use of big data technologies and concepts is the rapid growth of data, including the corresponding quality management and the automation of analysis and reporting. About a third of the companies take the topics of customer loyalty and marketing as an occasion to renew the analysis of their databases. 27 percent of respondents cite new database technologies as a motivation for new methods of data analysis. Furthermore, almost all characteristics of big data figure among the reasons for expanding strategic data management. This shows that big data is already a reality, even if in many cases it is not known by this term. The drivers themselves are the same across all industries and company sizes; the only difference lies in their significance and intensity. One big difference relates to company size and to distributing data and information to the right people in the company: this is where large companies see their biggest challenges, whereas smaller companies classify the issue as uncritical.

Big data: a use case for the cloud

The oil and gas industry has solved the processing of large amounts of data with traditional storage solutions (SAN and NAS). Research-oriented organizations, or companies like Google that deal with the analysis of mass data, are more likely to pursue the grid approach in order to invest the resources saved in software development.

Big data processing belongs in the cloud

Cloud infrastructures help reduce the costs of IT infrastructure. This allows companies to focus more effectively on their core business and gain greater flexibility and agility for implementing new solutions. It lays a foundation for adapting to ever-changing amounts of data and providing the necessary scalability. Thanks to the investments in their infrastructure, cloud computing providers are capable of developing and maintaining an environment that is well suited to big data, whereas a single company can neither provide adequate resources for scalability nor muster the necessary expertise.

Cloud resources grow with the amount of data

Cloud computing infrastructures are designed to grow or shrink with demands and needs. By using them, companies can easily meet the high requirements that big data brings, such as high processing power, large amounts of memory, high I/O, and high-performance databases, without investing heavily in resources of their own.

Cloud concepts such as infrastructure-as-a-service (IaaS) combine both worlds and thereby occupy a unique position. For those who understand the SAN/NAS approach, IaaS resources can also be used to design massively parallel systems. For companies that find it difficult to deal with or understand the above technologies, IaaS providers offer appropriate solutions to avoid the complexity of storage technologies and to focus on the company's actual challenges.

A suitable solution comes from cloud computing pioneer Amazon Web Services. With AWS Data Pipeline (still in beta), Amazon offers a service that moves and processes data automatically between different systems, whether they are located directly in the Amazon cloud or on another system outside it. Amazon thereby makes it easier to handle the growing amounts of data spread across distributed systems in different formats. For this purpose, any number of pipelines can be created in which the different data sources, conditions, targets, instructions, and schedules are defined. In short, it is about which data is loaded from which system under which conditions, how it is processed, and where the results are to be saved afterwards. A pipeline is started as needed: hourly, daily, or weekly. The processing can take place either directly in the Amazon cloud or on systems in the company's own data center.
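A simplified sketch of what such a pipeline definition might look like. The object/field shape follows AWS Data Pipeline's object model of ids, names, and key/value fields, but the concrete ids, values, and the one-day period are illustrative placeholders, not a ready-to-run configuration:

```python
# Simplified sketch of a daily pipeline definition in the shape of AWS Data
# Pipeline's object model. All ids and values are illustrative placeholders.

def field(key, value):
    """One key/value field of a pipeline object."""
    return {"key": key, "stringValue": value}

pipeline_objects = [
    {   # when the pipeline runs
        "id": "DailySchedule",
        "name": "DailySchedule",
        "fields": [field("type", "Schedule"),
                   field("period", "1 day"),
                   field("startDateTime", "2013-03-01T00:00:00")],
    },
    {   # what it does: copy data from a source to a target on that schedule
        "id": "CopyActivity",
        "name": "CopyActivity",
        "fields": [field("type", "CopyActivity"),
                   field("schedule", "DailySchedule"),
                   field("input", "SourceDataNode"),
                   field("output", "TargetDataNode")],
    },
]
```

The point of the service is exactly this declarative style: the pipeline states which data moves where and on what schedule, and the service takes care of executing it.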

Big Data = Big Opportunities?

Not only the Obama example shows how profitable the use of structured and unstructured data from mobile devices, social media channels, the cloud, and many other sources can be for a company. However, one has to be clear about one point regarding big data: in the end it is not the mass of data collected that matters, but its quality and what the data is ultimately used for.

It is therefore crucial whether, and how, a company manages to distill the masses of data generated by human and machine interactions into the highest-quality information, and thus secures a leading position in the market. Qualified data is the new oil and provides companies that recognize their own advantage in it with a lucrative engine.

Categories
Comment

The Windows Azure Storage outage shows: winner's crowns have no meaning

The global outage of Windows Azure Storage due to an expired SSL certificate has again made big waves on Twitter. The ironic aftertaste comes particularly from the fact that Windows Azure was recently named the "Leader in Cloud Storage" by Nasuni. Azure stood out especially for performance, availability, and write errors.

Back to business

Of course, the test result was rightly celebrated on the Windows Azure team's blog (DE).

"In a study by the Nasuni Corporation, various cloud storage providers such as Microsoft, Amazon, HP, Rackspace, and Google were compared. The result: Windows Azure took the winner's crown in almost all categories (including performance, availability, and write errors) and clearly emerges as the 'Leader in Cloud Storage'. An infographic with the main findings is here; the entire report is here."

However, the really severe outage, which could be felt around the world, brought the Azure team back down to earth. They should now return to the daily business and make sure that an EXPIRED SSL CERTIFICATE never again leads to such a failure. The outage has put the test result into perspective and shows that winner's crowns have no meaning at all. It is very surprising that at Microsoft it is always avoidable errors that lead to such catastrophic outages. Another example is the problem with the leap year 2012.
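The obvious countermeasure is to monitor certificate lifetimes so an expiry can never come as a surprise. A minimal sketch, assuming the date string has the format of the `notAfter` field that Python's `ssl` module returns from `getpeercert()`:

```python
# Minimal sketch of certificate-expiry monitoring. The date format matches
# the "notAfter" field of ssl.getpeercert(); the 30-day threshold is an
# assumption, not a recommendation from the article.

import datetime

def days_remaining(not_after):
    """Days until a certificate expires, e.g. 'Feb 22 12:00:00 2014 GMT'."""
    expires = datetime.datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return (expires - datetime.datetime.utcnow()).days

def needs_renewal(not_after, threshold=30):
    """Flag certificates that expire within the threshold (in days)."""
    return days_remaining(not_after) < threshold

assert needs_renewal("Feb 22 12:00:00 2013 GMT")      # long expired
assert not needs_renewal("Jan 01 00:00:00 2100 GMT")  # far in the future
```

In practice one would feed this from the live certificate of each endpoint and alert well before the threshold is reached.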


Categories
News

Microsoft Windows Azure Storage experienced a global(!) outage due to an expired SSL certificate

Microsoft Windows Azure Storage has been suffering massive problems worldwide(!) since Friday, February 22, 12:44 PM PST, due to an expired SSL certificate. The outage has also affected services that depend on Azure Storage (see screenshot). Microsoft is currently working to restore all services, and first test deployments appear to be succeeding. More can be found on the Windows Azure Service Dashboard.

A pity, given that Windows Azure Storage recently won a Nasuni performance comparison against Amazon S3.

