Performance, Scalability, and Reliability (PSR) challenges, metrics and tools for web testing : A Case Study
Context. Testing web applications is an important task, as it ensures their functionality and quality. The quality of a web application is assessed through non-functional testing, which covers attributes such as performance, scalability, reliability, usability, accessibility and security. Among these, performance, scalability and reliability (PSR) are the attributes most commonly considered in practice, yet very few empirical studies have examined them.

Objectives. The purpose of this study is to identify the metrics and tools available for testing these three attributes, and to identify the challenges faced while testing them, both in the literature and in practice.

Methods. A systematic mapping study was conducted to collect information about the metrics, tools, challenges and mitigations related to the PSR attributes; the required information was gathered by searching five scientific databases. A case study was also conducted to identify the metrics, tools and challenges of the PSR attributes in practice. The case study was carried out at Ericsson, India, where eight subjects were interviewed; four subjects working at other companies in India were also interviewed to validate the results obtained from the case company. In addition, documents from previous projects at the case company were collected for data triangulation.

Results. A total of 69 metrics, 54 tools and 18 challenges were identified from the systematic mapping study, and 30 metrics, 18 tools and 13 challenges were identified from the interviews. Data were also collected from documents, from which 16 metrics, 4 tools and 3 challenges were identified. Based on the analysis of these data, we compiled consolidated lists of tools, metrics and challenges.

Conclusions. We found that the metrics reported in the literature overlap with the metrics used in practice, whereas the tools found in the literature overlap with practice only to some extent. The main reason for this deviation is the limitations identified in the existing tools, which led the case company to develop its own in-house tool. We also found that the challenges partially overlap between the state of the art and practice. We were unable to collect mitigations for all of these challenges from the literature, so further research is needed. Among the PSR attributes, most of the literature addresses performance, and the interviewees were most comfortable answering questions about performance; we therefore conclude that empirical research on scalability and reliability is lacking. Our research deals specifically with the PSR attributes, and there is scope for further work: it can be extended to other quality attributes and carried out at a larger scale, considering more companies.
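The performance metrics catalogued in such studies are typically concrete measurements such as response time, percentile latency and throughput. As a rough illustration only (not taken from the thesis), the following Python sketch measures these against a hypothetical endpoint using only the standard library; the URL and load parameters are placeholders.

```python
# Minimal sketch of measuring common web performance metrics:
# mean/95th-percentile response time and throughput under a small concurrent load.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "http://localhost:8080/health"  # hypothetical endpoint under test
REQUESTS = 50
CONCURRENCY = 5

def timed_request(_):
    """Issue one GET request and return its response time in seconds."""
    start = time.perf_counter()
    with urlopen(URL, timeout=10) as response:
        response.read()
    return time.perf_counter() - start

wall_start = time.perf_counter()
with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    durations = list(pool.map(timed_request, range(REQUESTS)))
elapsed = time.perf_counter() - wall_start

print(f"mean response time : {statistics.mean(durations) * 1000:.1f} ms")
print(f"95th percentile    : {statistics.quantiles(durations, n=20)[-1] * 1000:.1f} ms")
print(f"throughput         : {REQUESTS / elapsed:.1f} requests/s")
```

Dedicated load-testing tools of the kind surveyed in the thesis automate this pattern at much larger scale, but the underlying metrics are the same.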
Main Authors: Magapu, Akshay Kumar; Yarlagadda, Nikhil
Format: Others (Bachelor's thesis)
Language: English
Published: Blekinge Tekniska Högskola, Institutionen för programvaruteknik, 2016
Subjects: Web applications; Web testing; Performance; Scalability; Reliability; Quality; Software Engineering (Programvaruteknik)
Online Access: http://urn.kb.se/resolve?urn=urn:nbn:se:bth-12801