
Cloud Computing Myth: Less know-how is needed

I recently came across an interesting statement in an article describing the advantages of cloud computing. One section was headed „There is no need to build additional know-how within the company“. This is plain wrong. On the contrary, it is exactly the opposite: more knowledge is needed than any vendor promises.

There is a lack of knowledge

The kind and amount of knowledge needed depends on the service obtained from the cloud. For a supposedly highly standardized software-as-a-service (SaaS) application such as e-mail, less knowledge about the service and its characteristics is needed than for a service that maps a specific business process.

With infrastructure-as-a-service (IaaS) or platform-as-a-service (PaaS) it looks quite different. Here the provider takes care of building, operating and maintaining the physical infrastructure, but responsibility for the virtual infrastructure (IaaS) lies with the customer. The provider itself – for fee-based support – or certified system integrators can help with setup and operations. The same applies when operating an own application on a cloud infrastructure: the cloud provider is not responsible for it and merely serves the infrastructure plus the means – APIs, web interfaces and sometimes value-added services – to help customers with development. In this context one needs to understand that, depending on how the cloud scales – scale-out vs. scale-up – the application has to be designed completely differently than in a non-cloud environment, namely distributed and automatically scalable across multiple systems (scale-out). This architectural knowledge is lacking in most companies at every nook and corner, which is also because colleges and universities have not taught this kind of programmatic thinking.
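
To make the scale-out idea concrete: instead of one big server, the work is spread over many identical, stateless workers that pull jobs from a shared queue, so capacity grows by simply starting more copies. A minimal sketch in Python using the AWS SDK (boto3, which postdates this article); the queue URL, the Environment tag and the process() logic are hypothetical:

    import boto3  # AWS SDK for Python

    sqs = boto3.client("sqs", region_name="eu-west-1")
    QUEUE_URL = "https://sqs.eu-west-1.amazonaws.com/123456789012/jobs"  # hypothetical queue

    def process(body):
        print("processing", body)  # placeholder for the actual business logic

    def worker_loop():
        # Stateless worker: any number of identical copies can run in parallel.
        # Scaling out means starting more of these processes, not buying a bigger machine.
        while True:
            resp = sqs.receive_message(
                QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20
            )
            for msg in resp.get("Messages", []):
                process(msg["Body"])
                # Delete only after successful processing: if a worker crashes,
                # the message becomes visible again and another instance picks it up.
                sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])

    if __name__ == "__main__":
        worker_loop()

A scale-up application, by contrast, keeps its state on one machine and can only grow as far as that machine does.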

Cloud computing is still more complex than it appears at first glance. The prime example of this is Netflix. The U.S. video-on-demand provider operates its platform on a public cloud infrastructure (Amazon AWS). In addition to an extensive production system that ensures scalable and high-performance operation, it has also developed an extensive test suite – the Netflix Simian Army – whose sole job is to ensure the smooth operation of the production system: among other things, virtual machines are constantly and arbitrarily shot down, while the production system must continue to function properly.
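
To illustrate the principle behind the best-known member of the Simian Army, the Chaos Monkey: a tool that randomly terminates production instances to prove the platform survives it. What follows is a deliberately simplified sketch in Python with boto3, not Netflix's actual implementation (which is a full open-source project); the Environment tag is an assumption:

    import random
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    def chaos_monkey(tag_value="production"):
        # Pick one running instance from the tagged group and terminate it.
        # If the system is built correctly, nobody should notice.
        resp = ec2.describe_instances(
            Filters=[
                {"Name": "tag:Environment", "Values": [tag_value]},  # hypothetical tag
                {"Name": "instance-state-name", "Values": ["running"]},
            ]
        )
        instances = [
            i["InstanceId"]
            for reservation in resp["Reservations"]
            for i in reservation["Instances"]
        ]
        if instances:
            victim = random.choice(instances)
            ec2.terminate_instances(InstanceIds=[victim])
            print("terminated", victim)

    if __name__ == "__main__":
        chaos_monkey()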

Demand for the managed cloud rises

The deployment model cannot do much to reduce this complexity, but the responsibility and the necessary know-how can be shifted. In a public cloud, self-service rules: the customer is initially 100 percent on his own and solely responsible for the development and operation of his application.

Many companies have recognized this and admit to themselves that they have neither the necessary knowledge nor the staff and other resources to successfully use a public cloud (IaaS). Instead, they prefer or expect help from cloud providers. In these cases the answer is not public cloud providers but managed cloud/business cloud providers, who complement their infrastructure with professional services.

Find more on the topic of the managed cloud at „The importance of the Managed Cloud for the enterprise“.


Dropbox alternatives for enterprise IT

The popularity of easy-to-use cloud storage services like Dropbox causes IT decision makers quite a headache. Yet the market already offers enterprise-ready solutions. This article introduces cloud storage services for professional use.

Dropbox drives shadow IT

Dropbox has driven cloud storage services into the enterprise. The fan base of the US provider extends from the ordinary employee up to the executive floor. In particular, fast access, ease of use on every device and low costs have made Dropbox an attractive product. But what at first sounds like a true success story is in reality a serious problem for CIOs and IT managers. Dropbox has led to a new form of shadow IT: the largely uncontrolled growth of IT solutions that employees and departments use without involving the IT department, often purchased with credit cards. Behind this mostly stands the criticism that internal IT departments are unable to deliver suitable solutions quickly and in the desired quality. This leads to situations where company data is stored in private Dropbox accounts, where it does not belong.

The Dropbox boom and the easy access to public cloud services in general have led to a discussion about the right of traditional IT departments to exist. Sooner or later they could die out, some analysts predict; the IT strings would then be in the hands of the line-of-business (LOB) managers. Yet the reality looks different: the often anxious LOB managers normally have neither the time nor the knowledge to make such IT decisions. They certainly know what is important for their area, but do they know which systems have to play together? For many years companies have been struggling with poorly integrated isolated applications and data silos. Public cloud solutions exponentiate this problem, and Dropbox is just the tip of the iceberg.

To get the Dropbox phenomenon under control, several vendors of enterprise cloud storage have established themselves in the past years. The widely used Dropbox service offers far from what typical enterprise policies and IT governance models demand.

Dropbox for Business

Since 2011 there has been „Dropbox for Business“, a corporate offering with advanced features for more security, team management and reporting capabilities. However, the solution does not have the breadth and variety of functions of other, similar offerings on the market. Dropbox is therefore better suited for small, familiar teams that do not require as much control as larger companies. For $795 per year for five users, unlimited space is available. Each additional user costs $125 per year.

Administrators get access, via a dashboard, to information about the activities of their users. This includes the devices used, browser sessions and applications. Here it is also possible to close browser sessions, disconnect devices and disable third-party apps.

For improved security, various authentication mechanisms can be activated, including two-factor authentication. There is also single sign-on (SSO) integration with Active Directory and other SSO providers. For its technical infrastructure Dropbox uses Amazon S3, which means the data is stored in one of the global Amazon data centers. These data centers meet high security standards such as SSAE 16, ISAE 3402 and ISO 27001. However, Dropbox does not guarantee a specific location of the data within the Amazon cloud, such as a data center in the EU. Dropbox states that the data is encrypted with AES 256-bit before it is stored on Amazon S3, but Dropbox itself has plain-text access to user files. Separate encryption is only possible with external tools.
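
How such external, client-side encryption works in principle: the file is encrypted locally before it ever reaches the sync folder, so the provider only stores ciphertext. A minimal sketch in Python with the cryptography library (AES 256 in GCM mode; the file name is hypothetical, and real tools do considerably more, e.g. key management and file-name encryption):

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def encrypt_file(path, key):
        # Encrypt locally before the file is placed in the sync folder,
        # so the storage provider only ever sees ciphertext.
        aesgcm = AESGCM(key)
        nonce = os.urandom(12)  # must be unique per file/version
        with open(path, "rb") as f:
            plaintext = f.read()
        with open(path + ".enc", "wb") as f:
            f.write(nonce + aesgcm.encrypt(nonce, plaintext, None))  # prepend nonce

    key = AESGCM.generate_key(bit_length=256)   # stays with the user, never with the provider
    encrypt_file("quarterly-report.xlsx", key)  # hypothetical file
    # only "quarterly-report.xlsx.enc" goes into the synchronized folder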

Another deficit is the lack of audit mechanisms at the level of files and user activities. It is not possible to centrally look into a single user account or to retrieve an earlier version of a file; this only works by logging in as that user to inspect the data. In addition, the reports provide no information about user activities such as uploading and sharing of files – a big gap in the audit process.

Strengths

  • Ease of use.
  • Supports the major operating systems.
  • Big market share and acceptance in consumer space.
  • Unlimited storage space at an attractive price.

Weaknesses

  • Dropbox has full plain text access to user files.
  • No end-to-end encryption.
  • Encryption only possible via external tools.
  • Weak reporting.
  • Insufficient administration and audit options.
  • Location of the data cannot be set.

Box

Box is one of the best-known providers of public cloud enterprise storage and targets small and medium-sized as well as large companies. The Business plan costs $15 per user per month for 3 to 500 users and includes 1,000 GB of storage space. Box for Enterprise offers an unlimited number of users and unlimited disk space; prices are available on request.

Clients for the common desktop and mobile operating systems allow synchronization and uploading of data from almost any device. Files can be locked and automatically released after a set time. In addition, depending on the plan, between 25 and 100 versions of a file are kept in the version history. Other functions include external authentication mechanisms, user management and auditing capabilities. The enterprise plan offers further management functions and access to APIs.

More functions open up depending on the plan. This is particularly visible at the permissions level: the higher the plan, the more types of users and access rights can be assigned to an object. Business and enterprise customers also get detailed reporting capabilities, including information on who has viewed and modified which files. Box offers further security features with authentication mechanisms for Active Directory, Salesforce, NetSuite, Jive and DocuSign, plus single sign-on (SSO) integration capabilities. For data center capacity Box cooperates with Equinix; among others, there is a data center in Amsterdam for the European market. Where Equinix has no sites, Box relies on Amazon Web Services.

Box's biggest weakness is the limitation of 40,000 objects for files and folders. Customers have been pointing out this restriction since mid-2012; so far, nothing has changed. There is only the information that the limit will be raised to 100,000 objects in „Box Sync 4“.

Strengths

  • Ease of use.
  • Variety of extensions.
  • Supports the major operating systems.
  • Many relevant features for business (management, audit, etc).

Weaknesses

  • Files and folders are limited to 40,000 objects.
  • Encryption keys are held by Box.

TeamDrive

TeamDrive from Hamburg is a file sharing and synchronization solution. It is intended for companies that do not want to store their sensitive data with external cloud services but still want to allow their teams to synchronize data and documents. For this, TeamDrive monitors arbitrary folders on a PC, laptop or smartphone that can be used and edited together with invited users. Thus, data is also available offline at all times. Automatic synchronization, backup and versioning of documents protect users against data loss. With the possibility of operating the TeamDrive registration and hosting server in an own data center, the software can be integrated into existing IT infrastructures; all necessary APIs are available for this. For TeamDrive Professional, enterprise customers pay 5.99 euros per user per month, or 59.99 euros per year.

Using the global TeamDrive DNS service, several independently operated TeamDrive systems can be linked together. If necessary, this allows customers to build a controlled community cloud in a hybrid scenario.

TeamDrive offers many business-related functions for the management and control of a storage service. These include rights management at Space level for different user groups, as well as a version control system to access older versions of documents and the changes made by group members. For synchronization, clients for all major desktop and mobile operating systems are available, including Windows, Mac, Linux, iOS and Android. With TeamDrive SecureOffice, the vendor has also brought an extension of its mobile clients to market with which documents can be edited inside the end-to-end encryption. An integrated mobile device management (MDM) helps to manage all devices used with TeamDrive; devices can be added, blocked or wiped. TeamDrive can be connected to existing directory services such as Active Directory and LDAP to synchronize user administration.

In addition to these management functions, TeamDrive features a fully integrated end-to-end encryption in which the encryption keys are exclusively owned by the user. Thus, TeamDrive is not able to access the data at any time. For encryption, TeamDrive relies on AES 256 and RSA 3072.
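
The usual pattern behind such end-to-end schemes (a generic sketch, not TeamDrive's actual protocol): every document is encrypted with a fresh AES 256 data key, and that key is wrapped with the RSA public key of each invited user, so only the holders of the matching private keys can unlock it. In Python with the cryptography library:

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # each user generates a 3072-bit RSA key pair; the private key never leaves the device
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
    public_key = private_key.public_key()

    # a fresh AES 256 data key encrypts the document itself
    data_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(data_key).encrypt(nonce, b"confidential document", None)

    # the data key is wrapped with the recipient's public key (one copy per member)
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    wrapped_key = public_key.encrypt(data_key, oaep)

    # only the holder of the private key can unwrap the data key and decrypt
    unwrapped = private_key.decrypt(wrapped_key, oaep)
    assert AESGCM(unwrapped).decrypt(nonce, ciphertext, None) == b"confidential document"

The server only ever sees the ciphertext and the wrapped keys, which is why the vendor cannot read the data even if compelled to.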

It should also be mentioned that TeamDrive, as the only enterprise storage solution, carries the privacy seal of the Independent Centre for Privacy Protection Schleswig-Holstein (ULD). The seal confirms that TeamDrive is suitable for use in businesses and governments for the confidential exchange of data.

Strengths

  • End-to-end encryption.
  • Different encryption mechanisms.
  • SecureOffice for mobile secure processing of documents.
  • Certification by the ULD.
  • Integrated mobile device management.
  • Many relevant functions for businesses.

Weaknesses

  • No locking of files.
  • No browser access.

Microsoft SkyDrive Pro

SkyDrive Pro is Microsoft's enterprise cloud storage, provided in conjunction with SharePoint Online and Office 365. The service is designed exclusively for business purposes and is therefore to be distinguished from SkyDrive, which is aimed at home users who predominantly store and share documents and photos in the Microsoft cloud. The management of SkyDrive Pro is the responsibility of the company: employees store, share and collaborate on business documents with colleagues within a private domain.

SkyDrive Pro is fully synchronized with SharePoint 2013 and Office 365. An administrator decides how the libraries within SkyDrive Pro can be used by each user; for this purpose, different access rights can be assigned to users and user groups. Using a client, documents can be synchronized with the local computer. Mobile clients are available for iOS and Windows Phone; Android and BlackBerry are currently not supported.

Documents or entire folders can be shared with individual colleagues or distribution lists, with rights assigned for read or write access. A recipient then receives an e-mail including a comment and the link to the document and can follow it to be informed about later changes. Sharing with partners and customers outside the domain is possible if the company allows external sharing.

According to Microsoft, all data in SkyDrive Pro is protected with several layers of encryption; the only way to get at the information is for an administrator to grant access rights to it. Furthermore, Microsoft guarantees that the private corporate data is protected from search engines, so that no metadata is collected in any form. In addition, SkyDrive Pro is compliant with HIPAA, FISMA and other data protection standards.

Strengths

  • Integration with Office 365 and SharePoint.
  • Clients for mobile operating systems.

Weaknesses

  • Proprietary Microsoft system.
  • Data centers in Europe only (Dublin, Amsterdam).
  • No Android client.

Amazon S3

Via a web service, Amazon S3 (Amazon Simple Storage Service) provides access to an unlimited amount of storage in the Amazon cloud. Unlike competing cloud storage services, the storage can only be accessed via a REST and SOAP interface (API); Amazon does not provide its own local client for synchronization. This is because Amazon S3 primarily serves as a central storage location that many other Amazon services use to store or retrieve data. An ecosystem of partners helps out with paid clients that add synchronization capabilities for desktop and mobile operating systems. Using Amazon's own AWS Management Console, folders and files can be accessed via the web interface.

With the API, data can be stored, read and deleted as objects in the Amazon cloud. The maximum size of an object is 5 GB. Objects are organized in buckets (folders). Authentication mechanisms ensure that the data is protected from unauthorized third parties: objects can be marked for private or public access, and different user access rights can be assigned to them.
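
A minimal sketch of these basic object operations in Python with the AWS SDK (boto3); the bucket and key names are hypothetical:

    import boto3

    s3 = boto3.client("s3", region_name="eu-west-1")
    BUCKET = "example-company-archive"  # hypothetical; bucket names are globally unique

    # store an object
    with open("2013-q4.pdf", "rb") as f:
        s3.put_object(Bucket=BUCKET, Key="reports/2013-q4.pdf", Body=f)

    # read it back
    data = s3.get_object(Bucket=BUCKET, Key="reports/2013-q4.pdf")["Body"].read()

    # mark the object for public read access (private is the default)
    s3.put_object_acl(Bucket=BUCKET, Key="reports/2013-q4.pdf", ACL="public-read")

    # delete it
    s3.delete_object(Bucket=BUCKET, Key="reports/2013-q4.pdf")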

Amazon S3 pricing varies by the region in which the data is stored. One GB of storage for the first TB in the EU region costs $0.095 per month. In addition, outgoing data transfer is charged: up to 10 TB per month, the traffic costs $0.12 per GB.
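
A quick back-of-the-envelope calculation with the prices quoted above (2013 list prices; request fees, which are also billed, are ignored here, and the usage figures are made up):

    stored_gb = 500   # average storage used per month
    egress_gb = 200   # outgoing transfer per month

    storage_cost = stored_gb * 0.095  # first TB, EU region
    egress_cost = egress_gb * 0.12    # tier up to 10 TB/month

    print(f"storage ${storage_cost:.2f} + egress ${egress_cost:.2f} "
          f"= ${storage_cost + egress_cost:.2f} per month")
    # storage $47.50 + egress $24.00 = $71.50 per month

The pay-per-use model keeps entry costs near zero, but exactly this kind of calculation has to be repeated as usage grows (see the weaknesses below).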

Many other cloud storage services use Amazon S3 to store their users' data, including Dropbox, Bitcasa and Ubuntu One.

Strengths

  • The API is the de facto standard in the market.
  • Very high scalability.
  • Very good track record.

Weaknesses

  • No clients of Amazon's own.
  • The pay-per-use model requires strict cost control.

ownCloud

Like TeamDrive, ownCloud is a file sharing and synchronization solution. It is aimed at companies and organizations that want to keep their data under their own control and not rely on external cloud storage services. The core of the application is the ownCloud server, which allows the software, along with the ownCloud clients, to be integrated seamlessly into an existing IT infrastructure; the server also enables the use of existing IT management tools. ownCloud presents itself as a local directory that mounts different local storages, so the files are available to all employees on all devices. In addition to local storage, directories can be connected via NFS and CIFS.

The ownCloud functions form a set of add-ons that are directly integrated into the system. These include a file manager, a contact manager, extensions for OpenID and WebDAV, and a browser plugin for viewing documents such as ODF and PDF. Other applications for enterprise collaboration are available in ownCloud's own marketplace. Files can be uploaded using a browser or synchronized with clients for desktop and mobile operating systems.
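
Since ownCloud speaks WebDAV, any WebDAV-capable tool can read and write files even without the official clients. A small sketch using Python's requests library; the server URL and credentials are hypothetical, while /remote.php/webdav is the usual ownCloud WebDAV endpoint:

    import requests

    BASE = "https://cloud.example.com/remote.php/webdav"  # hypothetical ownCloud server
    AUTH = ("alice", "secret")                            # hypothetical credentials

    # upload a file into the user's ownCloud storage
    with open("notes.txt", "rb") as f:
        requests.put(f"{BASE}/notes.txt", data=f, auth=AUTH).raise_for_status()

    # download it again
    resp = requests.get(f"{BASE}/notes.txt", auth=AUTH)
    resp.raise_for_status()
    print(resp.content.decode())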

Security is provided via a plugin for server-side encryption, which is not enabled by default. If the plugin is enabled, files are encrypted when they are stored on the server; however, only the contents of the files are encrypted, not the file names. Moreover, ownCloud relies exclusively on security „at rest“.

The biggest advantage of ownCloud is also its disadvantage. The control over the data that a company regains through the use of ownCloud comes with costs for setup and operation. Administrators need sufficient knowledge of operating web servers such as Apache, but also of PHP and MySQL, to run ownCloud successfully. In addition, meticulous configuration is needed, without which the expected performance of an ownCloud installation cannot be achieved.

Strengths

  • Open source.
  • Variety of applications.
  • Clients support the major operating systems.

Weaknesses

  • Weak security and encryption.
  • High costs for the operation of an own ownCloud infrastructure.

My cloud computing predictions for 2014

The year 2013 is coming to an end, and it is once again time to look into the crystal ball for the next year. To this end, I have picked out ten topics that I believe will be relevant in the cloud area in 2014.

1. The AWS vs. OpenStack API discussion will never end

This prediction will hold for 2015, 2016 and so on as well. It sounds like fun at first, but these discussions are annoying. OpenStack has more important issues to address than the never-ending comparisons with the Amazon Web Services (AWS), especially since the trouble constantly comes from outside, which does not help. If the OpenStack API is supposed to be 100% compatible with the AWS API, then OpenStack might as well be renamed Eucalyptus!

The OpenStack community must find its own way and make it possible for service providers to build up their own offerings away from industry leader AWS. However, this mammoth task ultimately rests with the providers. In the end, OpenStack is just a large construction kit for building cloud infrastructures; what happens with it in the end (the business model) is not the task of the OpenStack community.

Read more: „Caught in a gilded cage. OpenStack providers are trapped.“

2. Security gains more weight

Regardless of the recent NSA scandal, the security of and confidence in (public) cloud providers have always been questioned. At the same time, (public) cloud providers are better positioned to deliver the desired data protection than most businesses worldwide. This is partly due to the extensive resources (hardware, software, data centers and staff) they can invest in the development of strong security solutions. In addition, since data protection is part of a (public) cloud provider's core business, its administration by the provider can ensure smoother and safer operations.

Trust is the foundation of any relationship; this applies to both private and business environments. The hard-won confidence of recent years is being put to the test, and providers would do well to open up further. „Security through obscurity“ is an outdated concept. Customers need and want more clarity about what happens with their data. This will vary by world region, but at least in Europe, customers will continually put their providers to the test.

Read more: „How to protect companies’ data from surveillance in the cloud?“

3. Private cloud remains attractive

Financially and in terms of resource allocation (scalability, flexibility), the private cloud is not the most attractive form of cloud. Nevertheless, despite the predictions of some market researchers, it is not losing its significance. On the contrary: despite the cost pressure, the sensitivity of decision makers who want to retain control and sovereignty over their data and systems is underestimated. This is also reflected in recent figures from Crisp Research, according to which only about 210 million euros were spent on public infrastructure-as-a-service (IaaS) in Germany in 2013, while investments in private cloud infrastructure exceeded 2.3 billion euros.

A recent study by Forrester also shows:

„From […] over 2,300 IT hardware buyers, […] about 55% plan to build an internal private cloud within the next 12 months.“

This does not mean that Germany has to be the land of private clouds. In the end, the specific workload and use case always decide whether something runs in a private or public cloud. But there is a clear trend that the private cloud will remain a relevant form of cloud usage in the future.

4. Welcome hosted private cloud

Now that Salesforce has also jumped on this train, it is clear that enterprise customers cannot warm to the use of a pure public cloud. In cooperation with HP, Salesforce has announced a dedicated version of its cloud offering – „Salesforce Superpod“.

This extreme change in Salesforce's strategy confirms our findings from the exploration of the European cloud market. Companies are interested in cloud computing and its properties (scalability, flexibility, pay-per-use). However, they admit to themselves that they have neither the knowledge nor the time, nor is it their core business, to operate IT systems; instead they let the cloud provider take care of this, expecting a flexible kind of managed service. Public cloud providers are not prepared for this, because their business is to provide highly standardized infrastructure, platforms, applications and services. The remedy is for the provider to line up specially certified system integrators alongside.

Here lies an opportunity for providers of business clouds. Cloud providers that do not offer a public cloud model can, on the basis of managed services, help the customer find his way into the cloud, take over operations and supply an „everything from one source“ offering. This typically does not happen on shared infrastructure but within a hosted private cloud or a dedicated cloud, where a customer gets an explicitly isolated area. Professional services round off the portfolio with integration, interfaces and development. Due to the exclusivity and higher security (usually physical isolation), business clouds are more expensive than public clouds. However, considering the effort a company must make to actually succeed on its own in one or the other public cloud, or the cost of engaging a certified system integrator, the public cloud cost advantage is almost eliminated.

5. Hybrid cloud remains an evergreen

The hybrid cloud is always a hot topic in discussions about the future of the cloud – and it is real. Worldwide, the public cloud is initially used mainly to get easy and convenient short-term access to resources and systems. In the long run, this will grow into a hybrid cloud as IT departments bring their own infrastructure to the level of a public cloud (scalability, self-service, etc.) to serve their internal customers. In Germany and Europe, it is exactly the opposite: here private clouds are preferred, because privacy and data security have top priority. Europeans must and will get used to the public cloud by connecting certain components – even some critical systems – to a public cloud.

The hybrid cloud is about the mobility of data. This means that data is primarily held locally in an own infrastructure environment, shifted to another cloud for processing (e.g. a public cloud) and then placed back within the local infrastructure. This need not always involve the same provider's cloud; depending on cost, service levels and requirements, several clouds can be involved in the process.

6. Multi-cloud is reality

The topic of multi-cloud is currently highly debated, especially in the IaaS context, with the ultimate goal of spreading risk and taking advantage of the costs and benefits of different cloud infrastructures. But in the SaaS environment, too, the subject must necessarily gain importance in order to avoid data silos and isolated applications in the future, to simplify integration and to support companies in their best-of-breed strategies.

Notwithstanding these expectations, multi-cloud use is already a reality in companies, which use multiple cloud solutions from many different vendors, even if the services are not yet (fully) integrated.

Read more: „Multi-Cloud is “The New Normal”“.

7. Mobile cloud finally arrives

Meanwhile, there are providers who have discovered the mobile cloud slogan for their marketing. Much too late. The fact is that since the introduction of the first smartphones we have been living in a mobile cloud world. The majority of the data and information we access from smartphones and tablets no longer lives on the local device but on a server in the cloud.

Solutions such as Amazon WorkSpaces or Amazon AppStream support this trend. Even if companies are still careful about outsourcing their desktops, Amazon WorkSpaces will strengthen the trend, from which vendors such as Citrix and VMware will also benefit. Amazon AppStream underlines the mobile cloud to its full extent with graphics-intensive applications and games that are processed entirely in the cloud and only streamed to the device.

Read more: „The importance of mobile and cloud-based ways of working continues to grow“.

8. Hybrid PaaS is the top trend

The importance of platform-as-a-service (PaaS) is steadily increasing. Observations and discussions with vendors that have so far found no remedy against Amazon AWS in the infrastructure-as-a-service (IaaS) environment show that they will expand their pure compute and storage offerings vertically with a PaaS and thus try to win the favor of developers.

Other trends show that private PaaS and hosted private PaaS are trying to gain market share. With the Application Lifecycle Engine, cloudControl has packaged its public PaaS as a private PaaS that enterprises can use to operate an own PaaS on self-managed infrastructure; in addition, a bridge to a hybrid PaaS can be spanned. IaaS provider Pironet NDH has adapted Microsoft's Windows Azure Pack to offer an Azure PaaS in a hosted private environment on this basis. This is interesting, since it closes the gap between a public and a private PaaS. With a private PaaS, companies have complete control over the environment, but they also need to build and manage it; in a hosted version, the provider takes care of this.

9. Partner ecosystems become more important

Public cloud providers should increasingly take care to build a high-quality network of partners and ISVs to pave the customers' way into the cloud. This means the channel should be strongly embraced to address a broader base of customers who are potential candidates for the cloud. However, the channel will struggle with the problem of finding enough well-trained staff for the age of the cloud.

10. Cloud computing becomes easier

In 2014, more offerings will appear on the market that make the use of cloud computing easier. With these offerings, the „side issues“ of availability and scalability need less attention; instead, cloud users can focus on their own core competencies and invest their energy in developing the actual application.

An example: infrastructure-as-a-service market leader Amazon AWS says that IT departments using the AWS public cloud no longer need to take care of the procurement and maintenance of the physical infrastructure. That is true, but the complexity has shifted to a higher level: even though numerous tutorials and white papers exist, AWS does not stress that achieving scalability and availability within a scale-out cloud infrastructure can become arbitrarily complicated. These are costs that should never be neglected.

Hint: (Hosted) Community Clouds

I see growing potential for the community cloud (more in German), which currently has no wide distribution. In this context, I also see a shift from the currently pronounced centralization to a semi-decentralized nature.

Most companies and organizations see a great advantage in the public cloud and want to benefit from it in order to reduce costs, consolidate IT and at the same time gain more scalability and flexibility. On the other hand, future viability, trust, independence and control are important „artifacts“ that nobody would like to give up.

The community cloud is the medium to achieve both. It combines the characteristics of future viability, trust, independence and control with the benefits of a public cloud that come from a real cloud infrastructure.

Some solutions that can serve as a basis for building an own professional cloud already exist. However, one should always keep a close watch on the basic infrastructure that forms the backbone of the entire cloud. In this context, one should not underestimate the effort it takes to build, properly operate and maintain a professional cloud environment. Furthermore, the cost of the required physical resources needs to be calculated.

For this reason, it makes sense for many small and medium-sized enterprises, as well as shared offices, co-working spaces or regular project partnerships, to consider the community cloud. This type of cloud environment offers the benefits of a public cloud (shared environment) while meeting the future-viability, trust, independence and control requirements, alongside cost advantages that can be achieved, among other things, through dynamic resource allocation. For this purpose, one should think about building a team that exclusively takes care of the installation, operation, administration and maintenance of the community cloud. An own data center or a co-location is not a prerequisite; instead, an IaaS provider can serve as an ideal partner for a hosted community cloud.


Criteria for selecting a cloud storage provider

Anyone searching for a secure and enterprise-ready alternative to Dropbox should take a closer look at the vendors. The choice of a cloud storage vendor depends in most cases on the individual requirements, which decision makers first need to debate and define. In particular, this includes classifying the data: which data may be stored in the cloud, and which must remain in the own on-premise infrastructure. When selecting a cloud storage vendor, companies should consider the following characteristics.

Configuration and integration

The storage service should integrate into an existing or future cloud infrastructure in a simple manner. This empowers users to expand their existing storage cost-efficiently in a hybrid scenario. In addition, data should be able to be migrated from local storage into the cloud within a self-defined period, which creates the long-term option of doing without an own storage system for specific data. Equally, a straightforward and seamless export of data from the cloud needs to be ensured.

A further characteristic is the interaction of the cloud service with internal systems such as directory services (Active Directory or LDAP), providing a centralized source of user data to applications. For an easy and holistic administration of user access to the storage resources, this characteristic is mandatory. For this, the vendor should provide an open and well-documented API to realize the integration; alternatively, he can deliver native software.

Platform independence to access data from everywhere

Mobility is becoming more and more important for employees. For companies it is vitally important to accommodate these working habits and deliver appropriate solutions.

In the best case, the cloud provider enables platform-independent access to the data by providing applications for all common mobile and desktop operating systems as well as access via a web interface.

Separation of sensitive and public data

To give employees data access via mobile and web applications, further security mechanisms such as DMZs (demilitarized zones) and rights controls at a granular file level are necessary. A cloud storage provider should offer functions to separate data with higher security demands from public data. Companies that want to provide the data from their own infrastructure need to invest in further security systems or find a vendor who has integrated this type of security.

Connection to external cloud services

A cloud storage can be used as a common and consistent data base for various cloud services, integrating offerings such as software-as-a-service (SaaS) or platform-as-a-service (PaaS); the cloud storage then serves as the central store. For this purpose, the vendor needs to provide an open API to realize the connectivity.

Cloud storage – ecosystem and partner network

Especially for storage vendors who exclusively offer cloud solutions, a big ecosystem of applications and services is attractive and important to extend the storage service with further value-added functions. This includes, for example, an external word processor for editing documents within the storage together with multiple colleagues.

Size of the vendor – national and international

The track record is the most important evidence of past success, giving a statement about the popularity based on well-known customers and successful projects. This aspect can be considered for a national as well as an international footprint. Besides the available capacity, and thus the technological size, the international scope of a cloud storage vendor is also of vital importance. If a company wants to enable its worldwide employees to access a central cloud storage but decides for a vendor who only has data centers in the US or Europe, latency alone can lead to problems. In this respect, scalability regarding storage size as well as geographic scope are crucial criteria.

In addition, it is interesting to look at the vendor's roadmap: what changes and enhancements are planned for the future? Are these enhancements interesting for the customer compared to another potential vendor who does not offer them?

Financial background

A good track record is not the only criterion when choosing a vendor. Not least the drama around collapsed storage vendor Nirvanix has shown that the financial background must be considered. Especially during risk assessment, a company should take a look at the vendor's current financial situation.

Location and place of jurisdiction

The location where company data is stored is becoming more and more important, and the demand for physical storage of the data in one's own country is rising steadily. This is not only a German phenomenon: the French, Spanish and Portuguese also expect their data to be stored in a data center in their own country (http://research.gigaom.com/report/the-state-of-europes-homegrown-cloud-market/). The Czechs prefer a data center in Austria over one in Germany; the Dutch are more relaxed on this topic. Local storage of the data is basically no guarantee of legal compliance, but it makes it easier to apply local laws.

Most US vendors cannot fulfill physical locality of the data in each European country. Their data centers are located in Dublin (Ireland) or Amsterdam (Netherlands) and merely comply with European law. Many vendors have joined Safe Harbor, which allows them to legally transfer personal data into the US. However, this is just a pure self-certification, which in the wake of the NSA scandal is being challenged by the Independent Regional Centre for Data Protection of Schleswig-Holstein (Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein, ULD).

Cloud storage – Security

Regarding security, it is mostly all about trust, and a vendor only achieves trust with openness: he needs to show his hand to his customers technologically. IT vendors are often criticized when it comes to talking about their proprietary security protocols, and mostly the critics have good cause. But there are also vendors who willingly talk about it, and these companies need to be found. Besides the subjective topic of trust, the implemented security plays the leading role. Here it is important to look at the current encryption mechanisms a vendor is using. This includes: Advanced Encryption Standard (AES 256) for encrypting the data, and Diffie-Hellman and RSA 3072 for the key exchange.

The importance of end-to-end encryption of the entire communication is also rising. This means that the whole process a user runs through in the solution, from start to finish, is completely encrypted. This includes, among other things: the user registration, the login, the data transfer (send/receive), the transfer of the key pairs (public/private key), the storage location on the server, the storage location on the local device, and the session while a document is being edited. In this context, one must advise against separate tools that try to encrypt an insecure storage. Security and encryption is not a feature but a core function and belongs in the field of activity of the storage vendor, who has to ensure highly integrated security and good usability at once.

In this context it is also important that the private key for accessing the data and systems is exclusively in the hands of the user and is stored only in encrypted form on the user's local system. The vendor should have no capability to restore this private key and should never be able to access the stored data. Note: there are cloud storage vendors that are able to restore the private key and are thus also able to access the user's data.

Certification for the cloud

Certifications are a further indicator of the quality of storage vendors. Besides standards like ISO 27001, with which the security of information and IT environments is rated, there are also national and international certificates from approved certification centers.

These independent and professional certificates are necessary to get an honest statement on the quality and characteristics of a cloud service, the vendor and all downstream processes like security, infrastructure, availability, etc. Depending on how good the process and the auditor are, a certification can also lead to an improvement of the product when the auditor proactively gives advice on security and further functionality.


The Amazon Web Services grab at enterprise IT. A reality check.

AWS re:Invent 2013 is over, and the Amazon Web Services (AWS) continue to reach out to corporate clients with some new services. Having established itself as a leading infrastructure provider and enabler for startups and new business models in the cloud, the company from Seattle has been trying for quite some time to get one foot directly into the lucrative enterprise environment. Current public cloud market figures for 2017 from IDC ($107 billion) and Gartner ($244 billion) give AWS tailwind and encourage the IaaS market leader in its pure public cloud strategy.

The new services

With Amazon WorkSpaces, Amazon AppStream, AWS CloudTrail and Amazon Kinesis, Amazon introduced some interesting new services that particularly address enterprises.

Amazon WorkSpaces

Amazon WorkSpaces is a service providing virtual desktops based on Microsoft Windows, which can be used to build an own virtual desktop infrastructure (VDI) within the Amazon cloud. Windows Server 2008 R2 serves as the basis, rolling out desktops with a Windows 7 environment. All services and applications are streamed from Amazon data centers to the corresponding devices using PCoIP (PC over IP) by Teradici; these can be desktop PCs, laptops, smartphones or tablets. In addition, Amazon WorkSpaces can be combined with Microsoft Active Directory, which simplifies user management. By default, the desktops are delivered with familiar applications such as Firefox or Adobe Reader/Flash; administrators can adjust this as desired.

With Amazon WorkSpaces, Amazon enters completely new territory in which Citrix and VMware, two absolute market players, are already waiting. At VMworld in Barcelona, VMware just announced the acquisition of Desktone. VDI is basically a very exciting market segment because it relieves corporate IT of administration tasks and reduces infrastructure costs. However, this is a very young market segment, and companies are also very careful about outsourcing their desktops because, unlike with traditional on-premise terminal services, the bandwidth (network, Internet, data connection) is crucial.

Amazon AppStream

Amazon AppStream is a service that serves as a central backend for graphically intensive applications. With it, the actual performance of the device on which the applications are used should no longer play a role, since all inputs and outputs are processed within the Amazon cloud.

Since the power of devices is likely to keep increasing in the future, local performance can probably be disregarded. However, for building a real mobile cloud, in which all data and information are located in the cloud and the devices are only used as consumers, the service is quite interesting. Furthermore, the combination with Amazon WorkSpaces is worth considering, to provide applications on devices that serve only as thin clients and require no further local intelligence and performance.

AWS CloudTrail

AWS CloudTrail helps to monitor and record the AWS API calls for one or more accounts, covering calls from the AWS Management Console, the AWS Command Line Interface (CLI), own applications and third-party applications. The collected data is stored in Amazon S3 or Amazon Glacier for evaluation and can be viewed via the AWS Management Console, the AWS Command Line Interface or third-party tools. At the moment, only Amazon EC2, Amazon EBS, Amazon RDS and Amazon IAM can be monitored. AWS CloudTrail can be used free of charge; costs are incurred for storing the data in Amazon S3 and Amazon Glacier and for Amazon SNS notifications.
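
CloudTrail delivers the records as gzip-compressed JSON files into the configured S3 bucket, each file holding a Records array with one entry per API call. A sketch of scanning them in Python with boto3; the bucket name and account ID are assumptions, while AWSLogs/<account>/CloudTrail/ is the standard delivery prefix:

    import gzip
    import json
    import boto3

    s3 = boto3.client("s3")
    BUCKET = "example-cloudtrail-logs"            # hypothetical log bucket
    PREFIX = "AWSLogs/123456789012/CloudTrail/"   # standard prefix, account ID assumed

    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
        for obj in page.get("Contents", []):
            body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
            for rec in json.loads(gzip.decompress(body))["Records"]:
                # example audit question: who terminated instances, and when?
                if rec["eventName"] == "TerminateInstances":
                    print(rec["eventTime"], rec["userIdentity"].get("arn"),
                          rec["sourceIPAddress"])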

AWS CloudTrail, even if logging is not very exciting, is among the most important services for enterprise customers that Amazon has released lately. The collected logs assist with compliance by making it possible to record all accesses to AWS services and thus demonstrate conformity with government regulations. The same goes for security audits, which can thus trace vulnerabilities and unauthorized or erroneous data access. Amazon is well advised to extend AWS CloudTrail to all other AWS services as soon as possible and make it available worldwide in all regions. The Europeans, in particular, will be thankful.

Amazon Kinesis

Amazon Kinesis is a service for the real-time processing of large data streams. To this end, Kinesis is able to process data streams of any size from a variety of sources. Amazon Kinesis is controlled via the AWS Management Console by assigning different data streams to an application. Due to Amazon's massive scalability, there are no capacity limitations; however, the data is automatically distributed across the global Amazon data centers. Use cases for Kinesis are the usual suspects: financial data, social media and data from the Internet of Things/Everything (sensors, machines, etc.).

The real benefit of Kinesis as a big data solution is the real-time processing of data. Common standard solutions on the market process data in batches, meaning the data is never processed immediately, but at best a few minutes later. Kinesis removes this barrier and opens new possibilities for the analysis of live data.
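
What working against such a stream looks like in practice: a producer pushes records into the stream, and a consumer reads them shard by shard within seconds. A minimal sketch in Python with boto3; the stream name and payload are hypothetical, and a real consumer would track all shards and checkpoint its position:

    import json
    import time
    import boto3

    kinesis = boto3.client("kinesis", region_name="us-east-1")
    STREAM = "sensor-events"  # hypothetical stream

    # producer: push a measurement into the stream
    kinesis.put_record(
        StreamName=STREAM,
        Data=json.dumps({"sensor": "pump-42", "temp_c": 81.5}).encode("utf-8"),
        PartitionKey="pump-42",  # records with the same key land on the same shard
    )

    # consumer: tail the newest records of the first shard
    shard = kinesis.describe_stream(StreamName=STREAM)["StreamDescription"]["Shards"][0]
    iterator = kinesis.get_shard_iterator(
        StreamName=STREAM, ShardId=shard["ShardId"], ShardIteratorType="LATEST"
    )["ShardIterator"]

    while True:
        out = kinesis.get_records(ShardIterator=iterator, Limit=100)
        for rec in out["Records"]:
            print(json.loads(rec["Data"]))  # available seconds after put_record
        iterator = out["NextShardIterator"]
        time.sleep(1)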

Challenges: Public Cloud, Complexity, Self-Service, „Lock-in“

Looking at the current AWS references, the quantity and quality are impressive. Looking more closely, however, the top references are still startups, non-critical workloads or completely new developments. This means that most of the existing IT systems we are talking about are still not located in the cloud. Besides concerns about loss of control and compliance issues, this is because the scale-out principle makes it too complicated for businesses to migrate their applications and systems to the AWS cloud. In the end it boils down to the fact that they have to start from scratch, because an application not developed for distribution does not work the way it should on a distributed cloud infrastructure – key words: scalability, high availability, multi-AZ. These are costs that should not be underestimated. Even the migration of a supposedly simple web shop becomes a challenge for companies that do not have the time and the necessary cloud knowledge to redevelop it for a (scale-out) cloud infrastructure.

In addition, the scalability and availability of an application can only be properly realized on the AWS cloud by sticking to the services and APIs that guarantee them. Furthermore, many other infrastructure-related services are available, with more constantly being published, which make life distinctly easier for developers. Thus the lock-in is preprogrammed. I am of the opinion that a lock-in need not be bad as long as the provider meets the desired requirements. However, a company should consider in advance whether these services are actually indispensable. Virtual machines and standard workloads are relatively easy to move; for services that are woven very deeply into the own application architecture, it looks quite different.

Finally: even if some market researchers predict a golden future for the public cloud, the figures should be taken with a pinch of salt. Cloud market figures have been revised downwards for years, and one has to consider in each case how these numbers are actually composed. But that is not the issue here. At the end of the day it is about what the customer wants. At re:Invent, Andy Jassy once again made clear that Amazon AWS relies consistently on the public cloud and will not invest in own private cloud solutions. One can interpret this as arrogance and ignorance towards customers, as a pure will to disruption, or simply as self-affirmation. The fact is, even if Amazon will probably build the private cloud for the CIA, they do not have nearly the resources and knowledge to act as a software provider on the market; Amazon AWS is a service provider. However, with Eucalyptus they have set up a powerful ally on the private cloud side, which makes it possible to build an AWS-like cloud infrastructure in the own data center.

Note: nearly all Eucalyptus customers are said to also be AWS customers (source: Eucalyptus). Conversely, this means that hybrid cloud infrastructures already exist between on-premise Eucalyptus installations and the Amazon public cloud.

Advantages: AWS Marketplace, Ecosystem, Enabler, Innovation Driver

What is mostly ignored in the discussions about Amazon AWS and corporate customers is the AWS Marketplace; Amazon itself does not advertise it much either. In contrast to the pure cloud infrastructure, which customers can use to develop their own solutions, the marketplace offers full-featured software solutions from partners (e.g. SAP) that can be automatically rolled out on the AWS infrastructure. The cost of using the software is charged per use (hour or month), plus the AWS fees for the necessary infrastructure. Herein lies the real added value for companies: easily outsourcing their existing standard systems to the cloud and parting with the on-premise systems.

One must therefore distinguish strictly between the use of infrastructure for in-house development and the operation of ready-made solutions. Both are possible in the Amazon cloud. There is also the ecosystem of partners and system integrators who help AWS customers develop their solutions. Even if AWS itself is (currently still) a pure infrastructure provider, it must equally be understood as a platform for other providers and partners who operate their businesses on it. This is the key success factor and advantage over other providers in the market, and it will increase the long-term attractiveness for corporate customers.

In addition, Amazon is the absolute driving force for innovation in the cloud, which no other cloud provider is currently able to match technologically. It does not even take re:Invent to see this; Amazon proves it almost every month anew.

Amazon AWS is – partly – suitable for enterprise IT

The requirements Amazon has to meet vary depending on the country and use case. European customers are mostly cautious with data management and would rather store the data in their own country. I have already met more than one customer who was technically confident but for whom storing the data in Ireland was not an option. In some cases it is also the lack of ease of use: a company does not want to (re)develop its existing application or website for the Amazon infrastructure, for lack of the time and implementation knowledge, which may result in a longer time to market. Both can be attributed to the complexity of achieving scalability and availability on the Amazon Web Services. After all, it takes more than a few API calls; the entire architecture needs to be oriented towards the AWS cloud. In Amazon's case it is the horizontal scaling (scale-out) that makes this necessary. Companies instead prefer vertical scaling (scale-up), in order to migrate the existing system 1:1 and achieve success in the cloud directly, rather than starting from scratch.

However, the AWS references also show that sufficient use cases for companies exist in the public cloud in which data storage can be considered rather uncritical, as long as the data is classified beforehand and then stored in encrypted form.

Analyst colleague Larry Carvalho talked with a few AWS enterprise customers at re:Invent. One customer implemented a hosted website on AWS for less than $7,000, for which another system integrator wanted to charge $70,000. Another customer calculated that he would pay about $200,000 per year for an on-premise business intelligence solution including maintenance; on Amazon AWS he pays only $10,000 per year. On the one hand these examples show that AWS is an enabler; on the other hand, they show that security concerns in some cases yield to cost savings.


The keys belong in the hands of the users: TeamDrive receives fourth ULD data privacy recertification in a row

Confidence in encrypted communication for the transmission of sensitive information over the Internet is becoming increasingly important. Here, the Independent Regional Centre for Data Protection of Schleswig-Holstein (Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein, ULD), led by Thilo Weichert, is a pioneer when it comes to the independent verification and certification of privacy- and security-related issues in the field of information technology. In recent years TeamDrive has established itself as the Dropbox alternative for the enterprise and, in terms of security, relies consistently on end-to-end encryption. The vendor from Hamburg, Germany, a Gartner Cool Vendor 2013 in Privacy, has successfully received its fourth ULD data privacy recertification in a row.

The keys belong exclusively in the hands of the users

As part of the recertification, the security of TeamDrive version 3 was increased again: in addition to the 256-bit AES end-to-end encryption, security has been further enhanced with RSA 3072 encryption. The importance of end-to-end encryption of all communication keeps rising. This means that the whole process a user passes through in the solution is encrypted continuously from start to finish.

In this context it is very important to understand that the private key used to access the data and the system may be owned exclusively by the user, and is stored only in encrypted form on the user's local system. The vendor must have no way to restore this private key and must never get access to the stored data.

Despite all promises, encryption is useless if providers like Dropbox or Box hold the keys and can decrypt all the data. Control over the keys belongs exclusively in the hands of the users. Every company must take this into account when choosing a trusted vendor.


Runtastic: A consumer app in the business cloud

The success stories from the public cloud do not stop. New web and mobile applications that rely on infrastructure-as-a-service (IaaS) offerings appear constantly. This is just one reason that animates market researchers like IDC ($107 billion) and Gartner ($244 billion) to certify a supposedly golden future for the public cloud by the year 2017. The success of the public cloud cannot be argued away; the facts speak for themselves. However, it is currently mostly startups, and many times non-critical workloads, that move to the public cloud. Talking to IT decision makers from companies with mature IT infrastructures, the reality looks quite different. Our study on the European cloud market shows this as well. In particular, the self-service and the complexity are underestimated by public cloud providers when they approach corporate customers. But even for young companies the public cloud is not necessarily the best choice. I recently spoke with Florian Gschwandtner (CEO) and Rene Giretzlehner (CTO) of Runtastic, who decided against the public cloud and chose a business cloud, more precisely T-Systems.

Runtastic: From 0 to over 50 million downloads in three years

Runtastic consists of a series of apps for endurance, strength & toning, health & wellness and fitness, helping users achieve their health and fitness goals.

The company was founded in 2009 in Linz (Austria) and, due to its huge growth in the international mobile health and fitness industry, now has locations in Linz, Vienna and San Francisco. The growth figures are impressive: after 100,000 downloads in July 2010, there are now over 50 million downloads (as of October 2013). Of these, 10 million downloads come from Germany, Austria and Switzerland alone, which roughly means more than one download per second. Runtastic has 20 million registered users, up to 1 million daily active users and more than 6 million monthly active users.


[Figure: Runtastic growth figures. Source: Runtastic, Florian Gschwandtner, November 2013]

In addition to the mobile apps, Runtastic also offers its own hardware to make using the apps and services more convenient. Another interesting service is „Story Running“, which is meant to bring more variety and action to running.

Runtastic: Reasons for the business cloud

Considering these growth numbers and the growing number of apps and services, Runtastic is an ideal candidate for a cloud infrastructure – you would think even for a public cloud, compared to similar offerings. Runtastic began with its infrastructure at a classic web host, without a cloud model, and in recent years gathered experience with different providers. Due to the continuing expansion, new underlying conditions, the growing number of customers and the need for high availability, the decision was made to outsource the infrastructure into the cloud.

A public cloud was never an option for Runtastic, with a dedicated(!) location for the data as the top priority – and not only because of security concerns or the NSA scandal. Amazon Web Services (AWS), for example, is out of the question for purely technical reasons, since many of Runtastic's requirements cannot be mapped to it. Among other things, the core database comprises about 500 GB on a single MySQL node, which AWS cannot accommodate, according to Giretzlehner. In addition, servers of this size in a public cloud are so expensive that owning the hardware quickly pays for itself.

Instead, Runtastic opted for the business cloud of T-Systems. The reasons: a good setup, a high quality standard and two data centers at one location (in two separate buildings), plus a high level of security, professional technical service, the expertise of staff and partners, and a very good connection to the Internet hubs in Vienna and Frankfurt.

To that end, Runtastic combines the colocation model with a cloud approach: it covers the base load with its own hardware and absorbs peak loads with infrastructure-as-a-service (IaaS) computing power from T-Systems. The management of the infrastructure is automated with Chef. Most of the infrastructure is virtualized, but Runtastic follows a hybrid approach: the core database, the storage and the NoSQL clusters run on bare metal (non-virtualized). Thanks to the global reach (90 T-Systems data centers worldwide), Runtastic can ensure that all apps work seamlessly everywhere.
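
The division of labor behind such a setup can be sketched in a few lines. The following Python fragment is purely illustrative – the capacity numbers and the instances_needed helper are hypothetical, not Runtastic's actual tooling – and shows the basic cloud-bursting logic: serve the base load from owned hardware and rent IaaS capacity only for the peaks.

```python
# Illustrative cloud-bursting sketch; all numbers and names are hypothetical.
OWN_CAPACITY_RPS = 10_000       # requests/second the colocated hardware can serve
INSTANCE_CAPACITY_RPS = 1_000   # requests/second one rented IaaS instance absorbs

def instances_needed(current_load_rps: int) -> int:
    """How many IaaS instances must be rented on top of the base load?"""
    peak = max(0, current_load_rps - OWN_CAPACITY_RPS)
    # Round up: a partially used instance still has to be rented in full.
    return -(-peak // INSTANCE_CAPACITY_RPS)

# Example: at 13,500 req/s the colocation absorbs 10,000 and the
# remaining 3,500 req/s require 4 cloud instances.
print(instances_needed(13_500))  # -> 4
```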

An unconventional decision is the three-year contract Runtastic signed with T-Systems. It covers the housing of the central infrastructure, a redundant Internet connection, and on-demand computing power and storage from the T-Systems cloud. Runtastic chose this model because it sees the partnership with T-Systems as long-term and does not want to have to abandon the infrastructure from one day to the next.

Another important reason for a business cloud was the direct line to contact persons and professional services. This is also reflected in a higher price, which Runtastic consciously accepts.

The public cloud is not absolutely necessary

Runtastic is the best example that young companies can be successful without a public cloud and that a business cloud is definitely an option – even though Runtastic does not run entirely on a cloud infrastructure. The business case must be calculated individually: is it (technically and financially) worthwhile to invest in partial own infrastructure and absorb the peaks with cloud infrastructure?

This example also shows that the public cloud is no easy ride for providers, and that consulting and professional services from a business or managed cloud are in demand.

During the keynote at AWS re:Invent 2013, Amazon AWS presented its flagship customer Netflix and the Netflix cloud platform (see the video from minute 11), which was developed specifically for the Amazon cloud infrastructure. Amazon naturally sells the Netflix tools as a positive aspect, which they definitely are. However, Amazon conceals the effort Netflix expends in order to use the Amazon Web Services.

Netflix shows very impressively how a public cloud infrastructure is used via self-service. However, when you consider the effort Netflix expends to be successful in the cloud, you have to conclude that cloud computing is not simple, and that applications on a cloud infrastructure, no matter which kind of provider, need to be built with the corresponding architecture. Conversely, this means that using the cloud does not always lead to the promised cost advantages. Besides the savings in infrastructure costs that are always calculated up front, the other costs must never be neglected: the staff with the appropriate skills, and the development of a scalable and fault-tolerant application in the cloud.
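
A back-of-the-envelope calculation makes the point. All figures below are hypothetical and only illustrate how the advertised infrastructure saving can be eaten up by architecture rework and skills costs:

```python
# Hypothetical annual figures (EUR); all numbers are illustrative assumptions.
on_premise_infrastructure = 400_000   # running comparable capacity in-house
cloud_infrastructure = 250_000        # the headline figure vendors advertise

cloud_architecture_rework = 120_000   # redesigning the app for scale-out and failure
cloud_skills_premium = 80_000         # extra cost of staff with cloud experience

headline_saving = on_premise_infrastructure - cloud_infrastructure
real_saving = headline_saving - cloud_architecture_rework - cloud_skills_premium

print(f"Advertised saving: {headline_saving:,} EUR")  # Advertised saving: 150,000 EUR
print(f"Actual saving: {real_saving:,} EUR")          # Actual saving: -50,000 EUR
```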

In this context, the excessive use of infrastructure-level value-added services should be reconsidered, even though they simplify a developer's life. As a company, you should consider in advance whether these services are actually indispensable, in order to keep your options open. Virtual machines and standard workloads are relatively easy to move; for services that reach deep into your own application architecture, it looks different.

Kategorien
Comment

Disruptive world of IT: New technologies constantly change the enterprise status quo. #tsy13

At first, Zero Distance sounds like one of those new marketing phrases with which vendors try to tell us how to make the world a better place. Well, it is one of those marketing phrases – but one that contains a lot of truth. Looking at the use cases shown at the T-Systems Symposium 2013, and at others from all over the world, it becomes clear what potential modern information technologies offer. Here, the cloud and cloud computing are just a means to an end: they serve as an enabler for new business models and help to change our world.

Those who do not change die out!

It is a fact: traditional companies are doomed if they do not try to change. Hugging old assets is not the best strategy – especially in the age of digital business. Startups seem to appear from nowhere and overrun the market leaders in their fields, who have no chance to react at the same pace. The startups have the greenfield advantage: they do not have to deal with the burdensome legacy of IT and other areas. But there are also companies that have been successful in the market for quite a long time and understand the signs of the times. New technologies and concepts always have some influence on the business. Some companies have understood how to reinvent themselves and profitably use cloud computing, big data, mobility and collaboration for their own transformation. Others, however, cannot or will not understand this and prefer to remain faithful to their status quo.

Hey farmer, where is the bull?

It is frequently surprising in which industries information technology has a massive impact and leads to more efficiency. Take agriculture, for example – more concretely, the mating behavior of cows. This matters because a cow can only give milk after she has calved. For the farmer it is therefore very important that everything runs smoothly.

The solution: when a cow is in heat, she makes typical head movements. The cow therefore wears a collar with a mobile communications chip, and the farmer gets the information he needs to ready the bull. The solution helps with calving too: a thermometer with an integrated SIM card transmits its readings. About 48 hours before the birth the cow's body temperature changes, and two hours before the birth the farmer receives an SMS so he can be on site in time.
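
The logic behind such a connected sensor is simple threshold monitoring. The following Python sketch is hypothetical – the temperature thresholds and the send_sms helper are illustrative assumptions, not the actual product – but it shows the pattern of a sensor escalating from an early warning to an urgent alert:

```python
# Hypothetical sensor logic; thresholds and send_sms are illustrative assumptions.
TEMP_DROP_EARLY_C = 0.4    # drop hinting at a birth within roughly 48 hours
TEMP_DROP_URGENT_C = 1.0   # stronger drop hinting at a birth within ~2 hours

def send_sms(number: str, text: str) -> None:
    """Stand-in for a real SMS gateway behind the integrated SIM card."""
    print(f"SMS to {number}: {text}")

def check_reading(baseline_c: float, current_c: float, farmer: str) -> None:
    """Compare a reading against the cow's normal body temperature."""
    drop = baseline_c - current_c
    if drop >= TEMP_DROP_URGENT_C:
        send_sms(farmer, "Birth expected within ~2 hours. Please be on site.")
    elif drop >= TEMP_DROP_EARLY_C:
        send_sms(farmer, "Temperature change detected: birth within ~48 hours.")

check_reading(baseline_c=38.6, current_c=37.5, farmer="+43 ...")  # urgent alert
```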

IT departments need to act more proactively

IT departments are, and have always been, the scapegoats in a company. Justifiably? Well, some yes, others no. But is the IT department solely responsible for the transformation of the business? Yes and no. Primarily, the management is responsible for the alignment of the corporate strategy; it has to say which way the company should go, since it ultimately owns the vision. The big question is how the IT department behaves in this context. Is it just a supportive force that responds as necessary to the needs of management and colleagues, or does it act proactively?

Offense is the best defense. Today, IT departments should keep a close eye on the technical and innovative pulse and stay informed about changes in the market. Based on permanent internal or external market and trend research, they need to know what to expect for themselves and possibly for the business side, and respond as quickly and proactively as possible so as not to lose too much time – and, in the best case, to create a competitive advantage. Depending on the trend, they do not need to jump on every bandwagon, but they should at least examine it, understand its influence, and determine whether they or their company are affected. If they see potential for new business models, they should carry these into the management, which in turn needs to understand that IT today is an enabler and not just maintenance. The IT department thus plays a much larger role in the company today than it did ten years ago.

To that end, the management needs to lend the IT department a hand and free it from its routine tasks. About 80 percent of IT spending today goes into IT operations, merely to keep things running. These are investments in the status quo and do not lead to innovation. By contrast, only 20 percent of expenditures go into improvements or further developments. This ratio must be inverted, and the management together with the CIO have the task of driving this change so that the company remains innovative and competitive in the future.

Stop hugging the status quo.

Kategorien
Analysis

Airbus separates the traveler from the luggage #tsy13

At the T-Systems Symposium 2013, Airbus introduced Bag2Go, an intelligent suitcase that can travel completely independently and be transported to the destination without its owner. Airbus wants to give travelers more flexibility and mobility by letting them arrive at their destination free of luggage – by plane, ship, train, car, bike or on foot. The Bag2Go service relies on the business cloud infrastructure of T-Systems.

Background of Bag2Go

A Bag2Go is self-weighing, self-labeling, self-traveling and self-tracing, with features for continuous tracking. In addition, according to Airbus, it consistently adheres to all common standards today, so the current infrastructure at airports does not have to be changed. Using a smartphone app, the status and whereabouts can be monitored at any time.

Airbus plans to cooperate with different transport services (e.g. DHL). So, for example, the traveler can fly to the destination while the baggage arrives by land, by ship or on another aircraft. Package services or door-to-door luggage transportation services would carry the luggage to any destination at home or abroad, as part of a hotel or cruise booking.

Tracking is generally convenient, but it only makes sense if one can also intervene. For this purpose, Airbus relies on GPS tracking in conjunction with a transport service provider, such as a transportation/logistics company or the airline.

Motivation of Airbus

Airbus obviously did not develop the Bag2Go concept without self-interest. The point is to take the first steps toward the aircraft of the future: Airbus is building sustainable, ultra-light aircraft that require as little weight as possible, which in turn significantly reduces fuel consumption. Removing the heavy luggage from the plane is the key component. Separating passenger and baggage streams would enable Airbus and the entire industry to control the carriage of baggage proactively and open up completely new business opportunities.

According to Airbus, the containerization of luggage is continuously gaining in importance, as more and more people will use door-to-door transport and check in their baggage up to three days prior to departure or use a baggage kiosk. For this reason, Airbus will try to establish fully networked transport capsules as a standard.

Interesting approach – but not for everyone

With this concept, Airbus faces fewer technical than legal problems: a traveler and his luggage are required to fly together. A suitcase may fly after the passenger, but not ahead of him. In addition, business executives challenged the three-days-in-advance luggage check-in during the presentation, because they pack their suitcases themselves, up to one hour before the flight. A few conversations after the presentation confirmed this general opinion and attitude of business travelers on the subject.

Nevertheless, Bag2Go is an interesting concept that we will soon see in reality, and at the same time a great use case for the Internet of Things. Bag2Go is analogous to Internet routing, where a single IP packet, normally part of a complete data stream, can also take a different path to the destination.

Kategorien
Analysis

The fully interconnected world becomes reality #tsy13

Companies are exposed to ever shorter cycles of change in their everyday business. The consumerization of IT is a major driver that will retain an important role in the future. For several years, mobility, the cloud, new applications, the Internet of Things and big data (analytics) have been showing their effects. Besides these technological influences there is also the business impact – new business models, constant growth, globalization – and at the same time issues of security and compliance that lead to new challenges.

The evolution of the Internet leaves its marks

Looking at the evolution of the Internet, it is clear that the impact on companies and our society grows with the number of smart connections. It started with simple connections via e-mail or the web, and thus the digitization of information access. Then came the networked economy, in which e-commerce and collaboration digitized business processes. This was followed by what Cisco calls „realistic experiences“: the digitization of business and social interactions, such as social media, mobility and cloud. The next stage we will reach is the „Internet of Everything“, where people, processes, data and things are connected – the digitization of the world.

Each building block has its own importance in the Internet of Everything. The connections between people become more relevant and valuable. Through intelligent processes, the right person or machine receives the right information at the right time. Data is processed into valuable information for decision-making. Physical devices (things) are connected to each other via the Internet to enable intelligent decisions.

At the T-Systems Symposium 2013 in Dusseldorf, Cisco said that it expects about 50 billion interconnected smart objects by 2020, compared with just 7.6 billion people expected worldwide. Furthermore, Cisco is of the opinion that currently 99 percent of the world is not interconnected and that we will see trillions of smart sensors in the future. These will be installed, for example, in intelligent buildings, cities or houses, and will help to save energy and live more efficiently. They will also increase productivity and improve healthcare.

Technological foundations for the fully interconnected world

Due to its massive scalability, its distribution, new applications and the possibility of access from anywhere, the cloud is one of the main foundations for the fully interconnected world. Cisco's cloud strategy is therefore to enable cloud providers and enterprises to deploy and broker different kinds of cloud services, and at the same time to provide application platforms for the different cloud categories.

However, the cloud needs to be enriched by approaches from the Internet of Things. These include intelligent business processes, people-to-machine and machine-to-machine communication, as well as things such as sensors and actuators that are distributed everywhere. In the end, a worldwide, highly scalable infrastructure is necessary that can handle the temporal variations and differing requirements of the various workloads.

Another key component is therefore something Cisco calls fog computing. The fog has the task of delivering data and workloads closer to the user, who is located at the edge of a data connection; in this context one also speaks of „edge computing“. The fog is organizationally located below the cloud and serves as an optimized transfer medium for services and data within the cloud. Cisco characterized the term „fog computing“ as a new paradigm meant to support distributed devices during wireless data transfer within the Internet of Things. Conceptually, fog computing builds on existing and common technologies like content delivery networks (CDN), but, based on cloud technologies, it is meant to enable the delivery of more complex services.

As more and more data must be delivered to an ever-growing number of users, concepts are needed that extend the idea of the cloud and enable companies and vendors to deliver their content to the end user over a widely distributed platform. Fog computing should help bring distributed data closer to the end user, decreasing latency and the number of required hops, and thus better support mobile computing and streaming services. Besides the Internet of Things, the rising demand of users to access data at any time, from any place and with any device is another reason why the idea of fog computing will become increasingly important.

The essential characteristic of fog computing is thus that workloads can operate autonomously from the cloud in order to ensure fast, timely and stable access to a service.
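
The idea can be illustrated with a minimal sketch of such an intermediate layer. The following Python fragment is a simplified assumption of how an edge node might behave – a local cache with a time-to-live that answers requests itself and falls back to the (hypothetical) cloud origin only on a miss:

```python
import time

# Minimal sketch of an edge/fog node: serve from a local cache when possible,
# fall back to the distant cloud origin only on a miss or after expiry.
CACHE_TTL_SECONDS = 60.0

class EdgeNode:
    def __init__(self, fetch_from_cloud):
        self.fetch_from_cloud = fetch_from_cloud  # hypothetical origin callback
        self.cache = {}  # key -> (value, timestamp)

    def get(self, key: str):
        entry = self.cache.get(key)
        if entry and time.time() - entry[1] < CACHE_TTL_SECONDS:
            return entry[0]  # fast local answer: no round trip to the cloud
        value = self.fetch_from_cloud(key)  # slow path: ask the cloud origin
        self.cache[key] = (value, time.time())
        return value

# Usage: the second request is answered at the edge, without touching the cloud.
edge = EdgeNode(fetch_from_cloud=lambda k: f"content for {k}")
edge.get("video-segment-1")  # fetched from the cloud origin
edge.get("video-segment-1")  # served from the edge cache
```

This is exactly the amplifier role described above: the edge node keeps working on cached content even when the connection to the cloud is slow or briefly unavailable.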

The interconnected world needs intelligent technologies

To use the cloud as the fundamental basis for the Internet of Things and the Internet of Everything, some hurdles still need to be cleared out of the way.

The current cloud deployment model assumes that data is normally delivered directly, without an intermediate layer, to the device or end user. (Remark: CDNs and edge locations that provide caching and acceleration already exist.) It is therefore assumed that the bandwidth for transmitting the data is at its maximum and that the delay is, in theory, zero. But that holds only in theory. In practice, high latencies and thus delays, poor elasticity, and data volumes growing significantly faster than bandwidth lead to problems.

For this reason, approaches such as fog computing are needed precisely because limited bandwidth, varying delay times and dropped connections have to be assumed. Ideas like the fog establish themselves as an intelligent intermediate layer, a cache or a kind of amplifier, and coordinate the requirements of different workloads, data and information so that they are delivered independently and efficiently to the objects in the Internet of Everything.