Cloud Computing Challenges

Availability

It may seem ironic to discuss availability as a major challenge of Cloud Computing. After all, isn’t Cloud Computing the preferred solution when dealing with availability? Many of us assume that mirrored data, reliable backups, and multiple data centers are the iconic features of a Cloud solution. It is true that almost all Cloud solution providers have effective backup solutions; that is the baseline of every Cloud service. But backing up and restoring does not answer all your Cloud Computing needs. Availability is a key element in its own right: without it, even scheduled backup and restore operations become hard to carry out.

Achieving high availability has become quite a daunting task. Most Cloud providers promise that they can offer availability of up to 99.99% (Danseglio, 2012). In practice, this implies less than 5 minutes of downtime within a 30-day month, or just over 50 minutes of being offline in a year. However, that is not always the case. With network connectivity issues and unexpected outages, that level of availability often exists only on paper. Very few Cloud service providers have managed to honor such a guarantee (Danseglio, 2012).
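
To make these figures concrete, the downtime budget implied by an SLA percentage is simple arithmetic. The following is a minimal sketch in Python; the SLA values and time windows are illustrative and not tied to any particular provider.

```python
# Convert an availability SLA into an allowed-downtime budget.
def downtime_budget(sla_percent: float, window_minutes: float) -> float:
    """Minutes of allowed downtime for a given SLA over a time window."""
    return window_minutes * (1 - sla_percent / 100)

MONTH = 30 * 24 * 60   # minutes in a 30-day month
YEAR = 365 * 24 * 60   # minutes in a non-leap year

for sla in (99.9, 99.99, 99.999):
    print(f"{sla}%: {downtime_budget(sla, MONTH):.1f} min/month, "
          f"{downtime_budget(sla, YEAR):.1f} min/year")
# 99.99% works out to about 4.3 minutes per month, or ~53 minutes per year.
```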

When a company is moving its systems to the Cloud, it has to look for a provider that not only promises but also guarantees availability. Do not just take their word for it. Rather, make sure that there is some kind of commitment on their side: if you were to face availability issues, you would expect the provider to compensate you for the lost time. A detailed explanation of how this availability is assured should also be required (Mesbahi, Rahmani, & Hosseinzadeh, 2018).

It is important to note that different Cloud service providers have their own definitions and measurements of availability. Reviewing the provider’s definition of availability is therefore essential, as the company may have to adjust its requirements to reflect how availability is actually measured.
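
As a hedged illustration of why those definitions matter, the sketch below compares two plausible ways a provider might count availability over the same month: one in which every minute of downtime counts, and one that excludes scheduled maintenance. The figures are hypothetical.

```python
# Two hypothetical ways a provider might measure availability
# over the same 30-day month (all values in minutes).
total = 30 * 24 * 60
unplanned_outage = 20
scheduled_maintenance = 60

# Definition A: every minute of downtime counts against availability.
avail_a = (total - unplanned_outage - scheduled_maintenance) / total

# Definition B: scheduled maintenance is excluded from the window entirely.
avail_b = (total - scheduled_maintenance - unplanned_outage) / (total - scheduled_maintenance)

print(f"Definition A: {avail_a:.4%}")  # ~99.81% -- misses a 99.9% SLA
print(f"Definition B: {avail_b:.4%}")  # ~99.95% -- meets the same SLA
```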

Data Residency

Countries are continuously coming up with data residency laws aimed at protecting private data generated by their citizens, by making it mandatory for this information to be stored within the country (Hyun, 2018). A common assumption is that the data laws of a country apply to data stored within that country. As a result, large Cloud providers such as Salesforce, Microsoft, and Amazon are being forced to open data centers in various countries around the globe in order to comply with such regulations.

The issue of data residency is quite problematic for organizations. However much they may want to adopt Cloud Computing, if they store key customer data in data centers outside the country’s borders, they break the data residency laws and risk legal action. As such, some companies fall back on traditional, non-Cloud options while waiting for a Cloud provider to set up a local data center (Hyun, 2018).

The increase in data residency risk is fueled by concerns that some jurisdictions have more permissive data privacy laws than others. European Union countries, for instance, regard their data protection laws as stricter than those of any other jurisdiction. The EU therefore insists that particular data sets be stored within its boundaries, or that agreements be signed with other countries extending equivalent protection to wherever the data is ultimately stored.

Cloud Computing increases risk when it comes to data residency. The moment you upload data to a Cloud service, you are no longer in full control of that information. Even if you are promised strong encryption of this data, the Cloud service provider typically still holds the decryption keys and can use them whenever it deems necessary. A major concern is that private organizations or government agencies could obtain court orders forcing the Cloud provider to open the data to certain people. In most cases, the data owner may not even be aware that such a legal process is under way (CloudMask, n.d.).
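
One common mitigation, sketched below, is client-side encryption: the data is encrypted before upload so the provider never holds the decryption key. This is a minimal illustration using the symmetric Fernet scheme from the third-party Python cryptography package; the upload function is a hypothetical stand-in for whatever storage API is in use.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Generate and keep the key on-premises; the provider never sees it.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"customer data subject to residency rules"
ciphertext = cipher.encrypt(record)

# upload_to_cloud() is a hypothetical stand-in for the provider's API.
# Only ciphertext ever leaves the organization's boundary:
# upload_to_cloud("records/customer-1", ciphertext)

# Decryption is only possible with the locally held key.
assert cipher.decrypt(ciphertext) == record
```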

Multi-tenancy

Most people confuse multi-tenancy with multi-enterprise or multi-user. Multi-tenancy is a distinct concept with its own significance. Whether the organization has chosen a Private Cloud or a Public Cloud, it is essential to have a detailed understanding of the nuances of multi-tenant architecture (Kajeepeta, 2010). With Public Clouds, the stakeholders should be aware of the extent to which multi-tenancy is supported by the vendor in question. With Private Clouds, the organization is solely responsible for creating its own multi-tenant architecture.

It is worth noting that a tenant does not merely refer to an organization using a Public Cloud or controlling a Private Cloud. In a real sense, a tenant is an application that requires its own secure and exclusive computing environment (Rouse, 2014). That environment spans storage, enterprise architecture, and the user interface. Problems are bound to arise from the exclusivity demanded by each tenant application.

The degree of multi-tenancy depends on how much of the core application stack is designed to be shared between tenants. At the highest degree, tenants share everything up to the business logic, which each tenant can customize. At the lowest degree, tenants share little more than the underlying infrastructure.
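
As an illustration of one point on that spectrum, the sketch below shows shared-schema multi-tenancy, where every row carries a tenant identifier and every query is scoped to it. The table and column names are hypothetical.

```python
import sqlite3

# Shared-schema multi-tenancy: one table, rows scoped by tenant_id.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE invoices (tenant_id TEXT, amount REAL)")
db.executemany("INSERT INTO invoices VALUES (?, ?)",
               [("acme", 100.0), ("acme", 250.0), ("globex", 75.0)])

def invoices_for(tenant_id: str):
    """Every query must filter on tenant_id to preserve isolation."""
    return db.execute(
        "SELECT amount FROM invoices WHERE tenant_id = ?", (tenant_id,)
    ).fetchall()

print(invoices_for("acme"))    # [(100.0,), (250.0,)]
print(invoices_for("globex"))  # [(75.0,)]
```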

Performance

One of the many promises Cloud Computing makes has to do with increased agility at reduced cost. However, as you adopt the technology, risks are bound to arise that may hinder the achievement of the promised high performance. Most of the obstacles that block the uptake of Cloud Computing are related to performance (Techopedia, n.d.).

The following are some of the performance hurdles faced by organizations that have adopted, or would like to adopt, Cloud Computing solutions:

  • Determining the right application for your Cloud – Even though the Cloud promises to be all-inclusive, not all applications are suitable for it. As companies migrate to the Cloud, they face the challenge of identifying which applications will actually improve in performance once transferred. This can be quite hard to determine for sure, and the involvement of external consultants may be necessary even if it entails additional costs.
  • Topological dependencies – Migration to the Cloud means giving up a primarily static network for a dynamic network architecture, which still needs firewalls, security services, and load balancing.
  • Keeping up with service consumption – Assessing how a resource is consumed helps establish whether you are achieving peak performance or still have room for improvement. The organization needs a solution that keeps tabs on how consumption occurs. Resource allocation also needs to be prioritized, and this is best done when end-user performance is measured; a minimal monitoring sketch follows this list.
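
The sketch below shows one simple form such a solution might take: sampling the utilization of each resource and flagging anything running outside the band where peak performance is achievable. The resource names, samples, and thresholds are illustrative assumptions, not provider-specific values.

```python
# Minimal consumption monitor: flag resources running too hot
# (risking degraded end-user performance) or too cold (wasted spend).
SAMPLES = {                        # hypothetical utilization samples, in %
    "web-vm-cpu":   [88, 92, 95, 97],
    "db-vm-memory": [45, 48, 51, 49],
    "cache-node":   [8, 11, 9, 12],
}

HOT, COLD = 90, 15                 # illustrative thresholds

for resource, samples in SAMPLES.items():
    avg = sum(samples) / len(samples)
    if avg >= HOT:
        print(f"{resource}: {avg:.0f}% avg - scale up or rebalance")
    elif avg <= COLD:
        print(f"{resource}: {avg:.0f}% avg - over-provisioned, scale down")
    else:
        print(f"{resource}: {avg:.0f}% avg - within target band")
```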

Dealing with performance issues in the Cloud is not an easy matter. The difficulties arise from the fact that this is a fairly new technology, implying that solutions to various problems are yet to be fully rolled out (Rothstein, 2011).

Data Evacuation

Natural and man-made emergencies can be detrimental to the economy, the environment, and human life. In the same way, disasters can have a great impact on data stored in the Cloud. We live in an era where data breaches are commonplace. One day your data could be lying safely in the Cloud, and the next you hear news of leaked encryption keys that can be used to steal it.

The moment your Cloud provider loses control of those keys, the safety of your data is at the mercy of bad-faith actors. They may work on the encryption keys until they find their way into the organization’s data, holding you hostage until you pay a ransom. During this time, the organization loses vital business and may take ages to recover. That is why data evacuation becomes necessary the moment you face such threats (Adams, 2017).

Unfortunately, the lack of Cloud interoperability makes data evacuation hard to execute (Claybrook, 2011). Moving an application from a high-risk Cloud provider to a location deemed safer faces a number of challenges, including:

  • The need to rebuild the application so that it can be used in the target Cloud
  • Challenges in managing the application as it migrates from a Cloud whose user interface you know well to an unfamiliar target Cloud
  • Handling movement of the data itself during the evacuation (a sketch of this step follows the list)
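
For the data-movement step, object storage that exposes an S3-compatible API can be drained with a straightforward list-download-upload loop. The following is a minimal sketch using the boto3 library; the endpoint URLs and bucket names are hypothetical, and a real evacuation would also need pagination for large buckets, retries, and integrity checks.

```python
import boto3  # pip install boto3

# Hypothetical endpoints and buckets; credentials come from the environment.
source = boto3.client("s3", endpoint_url="https://old-provider.example.com")
target = boto3.client("s3", endpoint_url="https://new-provider.example.com")

SRC_BUCKET, DST_BUCKET = "prod-data", "prod-data-evacuated"

# List every object in the source bucket and copy it across.
listing = source.list_objects_v2(Bucket=SRC_BUCKET)
for obj in listing.get("Contents", []):
    key = obj["Key"]
    body = source.get_object(Bucket=SRC_BUCKET, Key=key)["Body"].read()
    target.put_object(Bucket=DST_BUCKET, Key=key, Body=body)
    print(f"evacuated {key} ({len(body)} bytes)")
```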

The only way to ease data evacuation is for Cloud interoperability to be enhanced. Users must be able to move data from one Cloud to another without hindrance. However, that is still far from being the case as of now.

Supervisory Access

Even though Cloud Computing introduces a host of opportunities, it also brings various challenges that make supervisory oversight necessary. The fact that stolen Cloud data can cripple an entire organization creates the need for supervisory access. Through the management of access, it is possible to tell who accessed what data and how that data was used.
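
The sketch below shows the core of such access management: an audit trail that records who touched which data, when, and for what purpose before the access proceeds. The record fields and the wrapped function are illustrative assumptions.

```python
import functools, json, time

AUDIT_LOG = []  # in practice, an append-only store the supervisor can query

def audited(func):
    """Record who accessed what data, when, and why, before serving it."""
    @functools.wraps(func)
    def wrapper(user: str, resource: str, purpose: str):
        AUDIT_LOG.append({
            "time": time.time(),
            "user": user,
            "resource": resource,
            "purpose": purpose,
        })
        return func(user, resource, purpose)
    return wrapper

@audited
def read_record(user, resource, purpose):
    # Hypothetical stand-in for the actual data fetch.
    return f"contents of {resource}"

read_record("alice", "customers/42", "billing dispute")
print(json.dumps(AUDIT_LOG, indent=2))
```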

There are several supervisory expectations that prove to be problematic when it comes to Cloud Computing:

  • Contracts – Cloud providers constantly change their policies regarding access to data. As a result, the organization’s management finds it challenging to keep up with the different service-level agreements, particularly those concerning information security management. It is the management’s responsibility to check that the provider complies with its obligations to allow certain levels of access to the organizational data.
  • Control – There is a strong need to evaluate Cloud provider controls with the goal of assessing how much control your provider has over your data. The more control they have, the less access you have to the data. Each provider has its own policies concerning data control. Controls should be structured so that they do not violate the rights of the original data owners – the people from whom the data is generated.
