co-located with ICST 2014, Cleveland, Ohio, USA
Cloud computing is everywhere and inevitable: originally a layered abstraction over a heterogeneous environment, it has become the paradigm for large-scale data-oriented systems. And while it has many attractive features (easy deployment of applications, resiliency, security, performance, scalability, elasticity, etc.), testing its robustness and reliability is a major challenge. The Cloud is an intricate collection of interconnected and virtualised computers, connected services, and complex service-level agreements. From a testing perspective, the Cloud is thus a complex composition of complex systems, and one can wonder whether anything like global testing is even possible. If the answer is no, what can we conclude from partial tests? The question of testing this large, network-based, dynamic composition of computers, virtual machines, servers, services, and SLAs seems particularly difficult. It is also critical for Cloud vendors: customers’ trust is crucial for companies implementing Clouds, and they have to ensure that the system actually has all the security and performance characteristics the marketing department highlights. This problem is a perfect example of concerns shared by academia and product companies, and it covers a broad range of topics, from software development to code analysis, from performance monitoring to formal models for system testing, and so on.
TTC aims to bring together researchers and practitioners interested in this difficult question of testing the Cloud, i.e., a complex, distributed, dynamic, and interconnected system. Hence, we call for regular scientific submissions as well as reports of industrial experience. We are interested in contributions related to 'testing the Cloud' (i.e., testing the Cloud itself, for instance its infrastructure), 'testing in the Cloud' (i.e., testing applications that are deployed in the Cloud), and 'testing with the Cloud' (e.g., using Cloud capabilities to perform stress testing on an application). All submissions describing approaches used in industry, defining new methods to facilitate testing, or identifying new challenges are relevant.
"Testing the Cloud" covers many different topics, much more than the list we wrapped up below. So we welcome academic and industrial contributions that sound relevant - whatever is the background of the authors. In particular, we will run regular academic sessions, but we are also likely to have a more industry-focused session where it will be possible to describe solutions deployed in product companies or best practices followed by practitioners.