Implementing automated testing tools was essential for restructuring our development organization. It led to fewer regressions and improved quality awareness.


Browser testing

In the 18th edition of our user interviews, we had the opportunity to speak with representatives of ASHITA-TEAM Co., Ltd., alongside Mr. Ito, CEO of MagicPod. We discussed various topics, including specific use cases and the decisive factors for selecting MagicPod.


Established over ten years ago, ASHITA-TEAM is an HR Tech company supporting digital transformation (DX) in the human resources domain as experts in personnel evaluation systems. The company offers one-stop services, from the construction of personnel systems to operational support and business-efficiency improvements using cloud systems.


  • Faced quality challenges due to the complete absence of E2E testing.
  • Selected MagicPod for its unlimited execution capability.
  • Degradation detection led to a reduction in rollbacks and improved quality awareness.
  • Achieved significant results within six months of implementation.


From left to right:
Mr. Masato Horinouchi, Executive Officer VPoE, Head of Product Department
Mr. Hidekuni Kajita, Product Department Manager
Mr. Satoshi Azumi, Product Department, QA Engineer
Mr. Ritsuki Fujiwara, Product Department, QA Engineer

Mr. Horinouchi (hereafter referred to as Horinouchi): As the VPoE and head of the product department, I am responsible for development. My main tasks include not only building development systems and improving development methods but also coordinating improvement projects with CS and sales, as well as recruiting and searching for partners. When I joined the company, the development organization was in need of restructuring, and there was hardly any testing being done. That's when I asked Mr. Azumi to join and help establish a QA system.

Mr. Azumi (hereafter referred to as Azumi): I joined ASHITA-TEAM around 2022. I started by helping set up the QA team and currently handle the maintenance of "ASHITA-CLOUD®". Specifically, I list and prioritize scenarios, and implement and maintain them in MagicPod.
The QA team has six members, divided into teams of three each for manual and automated testing. I am in a managerial position in the automated testing team, organizing tasks and making policy decisions. I have experience with various automated testing tools at different companies, from implementation to operation, including experience with implementing MagicPod at another company.

Mr. Fujiwara (hereafter referred to as Fujiwara): This is my first time working as a QA engineer, and I am mainly responsible for scenario creation and MagicPod implementation. Having worked as a development engineer, I encountered projects that struggled with maintenance after the engineers who had written the test code left, which sparked my interest in the QA field. It was around that time that Mr. Azumi introduced me to this job.

Horinouchi: In fact, Mr. Fujiwara and another member were introduced to us by Mr. Azumi.

Mr. Ito (CEO of MagicPod): Mr. Azumi's experience and ability to bring in QA engineers with development experience are truly impressive.

Horinouchi: Indeed. We lacked testing expertise, so we needed experienced individuals. Yet, finding QA engineers, especially those who can implement automated testing, is challenging. Luckily, Mr. Azumi was introduced to us, and I urgently requested his involvement. His ability to find and bring in the essential talent we needed has truly been invaluable to us.

Mr. Kajita (hereafter referred to as Kajita): Currently, I work as a backend engineer on new development projects and oversee the infrastructure of "ASHITA-CLOUD®" from an SRE perspective. Although I don't work with MagicPod directly, I set it up to run during staging deployments and send notifications to Slack, and I discuss with the QA team how to integrate QA practices and processes.

Journey to Implementing MagicPod

Ito: Mr. Horinouchi, you mentioned that the development organization needed restructuring when you joined. What was the situation like back then?

Horinouchi: There were some unit tests, but no E2E tests at all. It seemed like the engineers had taken the initiative to start testing but eventually gave up, realizing it was too much to handle on their own.

Azumi: When I first joined, there were three major issues. First, as Mr. Horinouchi mentioned, E2E tests were not being conducted, which led to frequent regressions because we were releasing without a complete understanding of the impact. Second, 'product quality' was a concern, with defects occurring in the production environment and technical debt accumulating. Third, regarding 'knowledge in automated testing,' although its importance was recognized, only a few members understood which tools to use and how to implement them.

Horinouchi: While there were no critical issues, we were in a situation where neglected bugs were simply tracked by the operations team. Without clear priorities for fix requests, it was unclear who was responsible for which fixes and by when, leading engineers to often say they were 'busy, busy' without a clear reason. So, we introduced Scrum development, setting up meetings for sprint reviews and product backlog refinement, creating a framework where business owners decide 'what to do and not to do' and establish priorities for releases.
However, without QA, defects would arise upon release, necessitating the urgent formation of a QA team. Thus, we asked Mr. Azumi to take on automated testing, which proved challenging due to the lack of a PRD, design docs, or any equivalent specifications.

Azumi: The lack of documents like specifications is a common issue in many companies (laughs). I thought that most problems could be solved by implementing an automated testing tool.

Decisive Factor in Choosing MagicPod

Ito: When starting testing, it's common to begin manually and then move to automation if problems arise. Were you thinking of automation from the start?

Horinouchi: Yes, indeed. We thought it unreasonable for development engineers to handle QA in their spare time, and since 'ASHITA-CLOUD®' is a large-scale product, manual testing alone is impractical. Mr. Azumi proposed that we automate, so it was a timely suggestion.

Ito: How did you evaluate the tools?

Azumi: We considered several but ultimately compared two tools that support Japanese. From there, we chose MagicPod for its unlimited execution count, affordability, and ease of maintenance. Mr. Horinouchi expressed the need to 'run tests daily,' so the fact that the cost does not change with the number of executions was a decisive factor.

Horinouchi: Particularly in agile development, testing is essential and must be performed iteratively. When asked at work, 'When do you conduct tests?' I often reply, 'Always' (laughs). I feel the industry has a low awareness of automated testing, often having conversations like, 'Who creates the automated tests?' 'QA engineers.' 'How?' 'By creating a test item document and implementing it.' 'Wow, that's something new!'

Ito: As mentioned earlier, there were no documents like specifications. How did you proceed with the implementation?

Azumi: It was a time of high personnel turnover, making it unclear whom to ask about using the service. Our initial focus turned to the CS documentation. CS makes easy-to-understand materials for customers that explain "this feature is used like this," which we utilized as references to list scenarios and determine their priorities.

Horinouchi: It felt like a lot of time was spent on listing.

Azumi: Yes. We prioritized them as "high, medium, low," ending up with about 60-80 scenarios. We decided to implement up to "medium," about 40 scenarios, and completed them in about half a year.

Horinouchi: Now that you both have finished automating the normal-case scenarios of 'ASHITA-CLOUD®', I'm asking you to implement and integrate automated testing into our new business products. However, given that those products are still in development, with requirements and specifications still fluid, we're somewhat uncertain how far to push the automation.

Ito: If the UI changes, it will require evaluation. I think it's great that automated testing has become an integral part of the development process.

Utilization of MagicPod

Ito: How are tests currently being executed?

Fujiwara: We run them regularly at 9 a.m. every morning. In the future, we're discussing enabling it to run during deployment as well.

Kajita: For 'ASHITA-CLOUD®', we execute unit tests and request specs (integration tests) from CircleCI. For new business products, we trigger MagicPod via GitHub Actions on pushes to the master branch, as well as on a regular schedule.
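As a rough illustration of the kind of CI integration described here, the sketch below builds the HTTP request a GitHub Actions job might send to kick off a MagicPod batch run after a push to master. The endpoint path, parameter names, and organization/project names are assumptions for illustration, not the documented MagicPod API.

```python
# Hypothetical sketch of triggering a MagicPod batch run from a CI job.
# The API base URL, endpoint path, and payload fields below are assumptions.
import json
import os
import urllib.request

API_BASE = "https://app.magicpod.com/api/v1.0"  # assumed base URL


def build_batch_run_request(org: str, project: str, token: str,
                            environment: str = "staging") -> urllib.request.Request:
    """Build (but do not send) the POST request that would start a batch run."""
    url = f"{API_BASE}/{org}/{project}/batch-run/"
    body = json.dumps({"environment": environment}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Token {token}",  # token kept in a CI secret
            "Content-Type": "application/json",
        },
        method="POST",
    )


if __name__ == "__main__":
    # In GitHub Actions, the token would come from a repository secret
    # exposed as an environment variable.
    req = build_batch_run_request("my-org", "my-project",
                                  os.environ.get("MAGICPOD_API_TOKEN", ""))
    print(req.full_url)
```

In a workflow file, a step like this would run on pushes to master; the scheduled morning run mentioned above would use a cron trigger on the same job.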

Ito: Are you seeing benefits in having automated testing?

Horinouchi: Being able to detect degradations before releases has reduced rollbacks. Test results are posted to a Slack channel that includes development members. Previously, due to an overwhelming number of defects, there was a tendency to ignore the alert channel, but now, when an alert that doesn't usually appear shows up on the MagicPod channel, it's taken seriously.

Azumi: By sharing "Now, this is failing, and this is how we're addressing it" during our weekly meetings, everyone started to take an interest, and began to actively check on their own. I believe this has raised awareness of quality.

Ito: That's excellent. Automated tests often have false positives, so the way tests are created and implemented is crucial. How have you been innovating in this area?

Fujiwara: In our weekly meetings, we've improved accuracy by setting rules, such as how to validate UI elements and waiting until screen rendering completes before moving to the next step. Clarifying these rules has made implementation smoother.

Azumi: We also make effective use of shared steps. Combining small parts into a single scenario makes maintenance easier and reduces the likelihood of failures. We've been layering such maintenance to prevent failures.

Ito: I see, that's really good.

Horinouchi: In the previous development setup, everything was a black box: it was unclear what specifications we were building to, or how and where improvements should be made. The establishment of the QA team and the creation of test item documents enabled QA and development engineers to align on 'correct behaviors and incorrect behaviors,' thus improving specifications and reducing rollbacks.
Mr. Azumi lists areas where there might be bugs and executes automated testing. For manual testing, we create test item documents for each case to understand the scope of modifications, but when there are unforeseen side effects, automated tests detect them, thus offering reassurance.
Since automated testing is a relatively new technology, not many possess both the knowledge and experience. Having those actively involved, such as Mr. Azumi and Mr. Fujiwara, share insights like 'do this in such cases,' 'integrate tests this way,' or 'execute at this timing,' has also helped accumulate valuable knowledge within our development team. Additionally, Mr. Azumi recently refactored our scenarios.

Azumi: As I mentioned, during implementation we created scenarios based on the CS-provided materials, prioritizing getting from zero to one even with incomplete content. We later refactored them, redesigning the scenarios to reflect actual user situations, and managed to reduce the number of test cases from about 70-80 to around 30.
While some may aim for perfection from the start, as you progress you'll find areas to improve. I believe prioritizing speed initially and then refactoring once stability is achieved is the best approach.

Horinouchi: Mr. Azumi’s suggestion to refactor has improved maintainability, so I'm glad we went through with it. Honestly, considering it's only been half a year since implementation, the results have been more than sufficient.

In Conclusion

Azumi: Combining manual and automated testing leverages the strengths of both, allowing for highly efficient testing. If you're wondering whether to automate, I recommend proceeding with it. However, it's not about blindly implementing automated testing. It's essential to carefully consider what to automate and how to maintain it.
As for tools, I recommend MagicPod for its low implementation barrier, affordability, and ease of operation. You can try it on a trial basis, and if it seems good, continue using it. Otherwise, you have the option to consider other tools or persevere with manual testing.
Let MagicPod handle repetitive defined testing procedures. Then, use the freed-up resources for creative testing, like difficult UI/UX or exceptional patterns, and functional testing of new features, which will further improve product quality.

Fujiwara: I also recommend MagicPod. In projects I've been involved in the past, human testers gradually became reluctant and careless, often leading to inaccurate verifications. Therefore, it's ideal to delegate tasks that humans find challenging to machines, and have humans conduct exceptional tests from a user's perspective.

Kajita: What's great about MagicPod is that it can be controlled externally via its API. Not only can it perform automated testing, but its ability to be integrated into deployment and release processes is also a strength.

Horinouchi: I consider automated testing extremely important, but it was challenging to get our team members to understand its significance. We've made efforts to raise awareness that for continuous product improvement, QA is essential to guarantee that it functions as expected and detects when it doesn't, and I believe our team has finally come to understand it. However, this isn't just limited to our team, as I think the entire industry still has a long way to go in grasping its importance. I hope MagicPod continues to excel as a leading company.

Ito: We will do our best. We hope to share the case of ASHITA-TEAM with more people and spread the importance of automation. Thank you for your time today.