Description for Students
Data Files: Project A/B Testing
Overview of A/B Testing
A/B testing, or split testing, is a method used to compare two versions of a webpage or product to determine which one performs better. The comparison is based on a specific metric, such as conversion rate, click-through rate, or time spent on the page. One version (A) is the control, while the other (B) is the variation. By randomly showing different users either version A or B and measuring the outcomes, businesses can make data-driven decisions.
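The random-assignment idea described above can be sketched in a few lines. This is a minimal, hypothetical simulation (the visitor count and the "true" conversion rates are made up for illustration): each visitor is randomly shown version A or B, and conversions are tallied per version.

```python
import random

random.seed(42)

# Made-up underlying conversion rates, used only to simulate outcomes.
TRUE_RATE = {"A": 0.10, "B": 0.12}

counts = {"A": 0, "B": 0}       # visitors shown each version
conversions = {"A": 0, "B": 0}  # conversions observed per version

for _ in range(1000):
    version = random.choice(["A", "B"])      # random 50/50 split
    counts[version] += 1
    if random.random() < TRUE_RATE[version]:  # simulated outcome
        conversions[version] += 1

for v in ("A", "B"):
    rate = conversions[v] / counts[v]
    print(f"Version {v}: {counts[v]} visitors, "
          f"{conversions[v]} conversions, rate = {rate:.2%}")
```

Because assignment is random, any systematic difference in the measured rates can be attributed to the design change rather than to which users happened to see which version.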
Project Goals
Students will perform an A/B test analysis to determine whether a new webpage design leads to a higher conversion rate compared to the current design. The project will involve the following steps:
- Preliminary Analysis using Google Sheets
- Data Cleaning and Insights Generation using Python
- Visualization and Dashboard Creation using Power BI, connected to the data through SQL
Parameters to Test
- Conversion Rate: The percentage of visitors who take the desired action (e.g., sign up, purchase).
- Bounce Rate: The percentage of visitors who leave the site after viewing only one page.
- Average Time on Page: The average amount of time users spend on a webpage.
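The three metrics above can each be computed directly from session-level records. The sketch below uses hypothetical rows with assumed field names (`converted`, `pages_viewed`, `time_on_page_s`), which may differ from the project's actual data files.

```python
# Hypothetical visitor sessions; field names are assumptions for illustration.
sessions = [
    {"converted": 1, "pages_viewed": 3, "time_on_page_s": 120},
    {"converted": 0, "pages_viewed": 1, "time_on_page_s": 15},
    {"converted": 0, "pages_viewed": 2, "time_on_page_s": 45},
    {"converted": 1, "pages_viewed": 5, "time_on_page_s": 210},
]

n = len(sessions)

# Conversion rate: share of visitors who took the desired action.
conversion_rate = sum(s["converted"] for s in sessions) / n

# Bounce rate: share of visitors who viewed only one page.
bounce_rate = sum(1 for s in sessions if s["pages_viewed"] == 1) / n

# Average time on page, in seconds.
avg_time_on_page = sum(s["time_on_page_s"] for s in sessions) / n

print(f"Conversion rate:  {conversion_rate:.0%}")    # 2/4 = 50%
print(f"Bounce rate:      {bounce_rate:.0%}")        # 1/4 = 25%
print(f"Avg time on page: {avg_time_on_page:.1f} s") # 97.5 s
```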
Hypothesis Testing
- Null Hypothesis (H0): The new webpage design has no effect on the conversion rate.
- Alternative Hypothesis (H1): The new webpage design changes the conversion rate.
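A common way to test these hypotheses is a two-proportion z-test on the conversion counts. The sketch below implements it with only the standard library; the visitor and conversion counts passed in at the bottom are illustrative numbers, not the project's data.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under H0 (no difference between designs).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative counts only: 1,000 visitors per version.
z, p = two_proportion_z_test(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
print(f"z = {z:.3f}, p = {p:.4f}")
if p < 0.05:
    print("Reject H0: conversion rates differ at the 5% level.")
else:
    print("Fail to reject H0 at the 5% level.")
```

In practice, the same test is available as `proportions_ztest` in statsmodels; writing it out once makes the z-statistic and p-value less of a black box.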
Process
- Preliminary Findings with Google Sheets
  - Collect the raw data from the A/B test.
  - Use Google Sheets to perform initial data exploration, such as summarizing key metrics and creating basic visualizations.
  - Document any preliminary insights or observations that might inform further analysis.