The COMAQA Winter 2017 conference is fully dedicated to manual and automated QA.
The conference is organized by the COMAQA.BY community, which unites manual and automation QA engineers, developers, DevOps specialists, managers, and other IT professionals around manual and automated QA.
An activist of the COMAQA.BY and CoreHard.BY communities, co-founder of DPI.Solutions, and a manager at EPAM Systems. More than 13 years of experience in IT. Specializes in low-level development, QA automation, management, and sales.
Filipp has wide experience in mobile software development, from enterprise applications to graphically intensive video games. He is an active participant in the young mobile gamedev community, working to bring in professional techniques such as lean development and test automation. Filipp is a frequent speaker at tech conferences across Europe and Russia. He currently works at Creative Mobile.
From Novosibirsk. Has a scientific background, including work at the “Vector” Center of Virology and Biotechnology. More than 5 years of experience in manual and automated testing, mostly in the financial domain. “Favorite” language: Python. An employee of EPAM Systems and DPI.Solutions, and a teacher of the fundamentals of automated testing.
10 years in IT: from manual testing to developing and introducing test automation frameworks, CI, and CD. Systems engineer at EPAM Systems.
Junior QA Automation Engineer @Applied Systems Ltd.
8 years in IT. Primary areas: software development and automated testing. Architect and key developer of ReportPortal.
Joined EPAM in 2007. With experience in both back-end and front-end development, he brought development best practices into the automation frameworks of the Low Level Programming Department, growing from Junior Software Engineer to Project Manager. He now heads Solution Accelerators implementation and delivery at TCC, differentiating EPAM at competitive stages and pushing EPAM solutions into open source.
I have about 5 years of experience in automation; I currently work with Appium for Android and iOS.
Speaker to be announced.
- What do automated tests give “ordinary” test engineers? Are we working in the style of “the girl was let go, an automated system was brought in”?
- Can testing engineers trust the results of automated tests, and should they? And what to do if there is no trust?
And, of course:
- Who is to blame, and what is to be done?
Software testing is a rapidly growing field, and I want to look a little into the future. What should testing look like in 2020? What directions should we, test engineers, concentrate on? What goals should we pursue, and what will we gain once we reach them? Enjoy the future now.
3. Architectural patterns
4. Design patterns specific to QA automation
5. General-purpose design patterns
6. How to make the right choice
7. Stateful and stateless solutions
8. Comparative analysis
9. Wrappers as one of the key points
10. Comparative analysis
11. Using cloud features, e.g. scalability
12. Mobile: nuances of using emulators and/or real devices
13. Parallelization and multithreading as part of test automation
14. Popularity of OS-related tasks: working with the file system, time, remote connections, the Win32 API, etc.
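To illustrate the wrapper idea from the outline above, here is a minimal sketch of hiding a driver-level UI element behind a stable, test-facing API. All names (`RawElement`, `Button`, `safeClick`) are illustrative, not from the talk; in a real framework the raw element would be, for example, a Selenium `WebElement`.

```java
// Minimal sketch of the wrapper pattern for UI test automation.
public class WrapperSketch {
    // Stand-in for a driver-level element (e.g. a Selenium WebElement).
    interface RawElement {
        void click() throws IllegalStateException;
    }

    // The wrapper: one place to add waits, retries, and logging,
    // so test code never talks to the raw driver directly.
    static final class Button {
        private final RawElement raw;
        private final int maxAttempts;

        Button(RawElement raw, int maxAttempts) {
            this.raw = raw;
            this.maxAttempts = maxAttempts;
        }

        // Retry the click a few times before giving up.
        boolean safeClick() {
            for (int i = 0; i < maxAttempts; i++) {
                try {
                    raw.click();
                    return true;
                } catch (IllegalStateException e) {
                    // Element not ready yet; real code would wait here.
                }
            }
            return false;
        }
    }

    public static void main(String[] args) {
        // Fake element that fails twice, then succeeds.
        int[] calls = {0};
        RawElement flaky = () -> {
            if (++calls[0] < 3) throw new IllegalStateException("not ready");
        };
        System.out.println(new Button(flaky, 5).safeClick());   // prints: true

        RawElement broken = () -> { throw new IllegalStateException(); };
        System.out.println(new Button(broken, 2).safeClick());  // prints: false
    }
}
```

The point of the wrapper is that retries, logging, and waits live in one place; swapping the underlying driver or adding reporting does not touch the tests themselves.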
Implementing automation is itself a project: one with goals and objectives, one that should build on the existing resource base and take “external” factors and risks into account. Launching such a project and creating clear procedures and regulations that will become the basis for all subsequent automation projects is an idea many IT companies wish to fulfil.
In this round-table discussion we will try to cover questions such as:
1) What should we begin with: studying the project under test, customer wishes, or team abilities?
2) Building the team:
- selecting team members (how many people, which skills are required, how we select them)
- distributing tasks and defining areas of responsibility
- organizational questions (vacations, onboarding new team members, etc.)
- features of a distributed team
- interaction with other teams (developers, DevOps, “manual engineers”, business)
3) Choosing coverage and test types for automation: regression, functional, performance, etc.
4) Selecting frameworks, tools, etc.
5) Licensing and open-source tools
6) Selecting an architectural solution
7) Agreements on coding style, the code review process, test method naming, etc.
8) Agreements on test data preparation, test environment configuration, test runs, and test result analysis
9) Metrics (what we collect and when we start collecting)
10) Documenting the automation implementation process; surveying employees
11) The impact of the software development methodology on starting and developing an automation project
12) The impact of the development stage on when automated testing starts
13) Differences in starting automation for web, mobile, and desktop products.
We're going to talk about:
1. Running tests in the cloud: server-side vs. client-side execution.
2. The range of devices and frameworks available in TestDroid and TestObject.
3. Available actions and limitations when working with devices in the cloud.
4. Integrating an Appium-based project using client-side execution and the cloud service API.
5. Integration with HockeyApp.
6. How we tested and what came of it: personal experience.
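Client-side execution (point 4 above) usually means pointing an ordinary Appium client at the provider's remote URL and passing provider-specific capabilities. A minimal sketch of assembling such capabilities as a plain map; the provider-specific key (`testdroid_apiKey`) and device name are assumptions for illustration, so check the cloud provider's documentation for the exact names:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: building capabilities for a cloud device farm.
// Key names vary by provider (TestDroid, TestObject, etc.).
public class CloudCapsSketch {
    static Map<String, Object> cloudCapabilities(String apiKey, String device) {
        Map<String, Object> caps = new LinkedHashMap<>();
        caps.put("platformName", "Android");  // standard Appium capability
        caps.put("deviceName", device);       // which cloud device to request
        caps.put("testdroid_apiKey", apiKey); // hypothetical provider-specific key
        return caps;
    }

    public static void main(String[] args) {
        Map<String, Object> caps = cloudCapabilities("SECRET", "Nexus 5");
        // In a real run these would be passed to the Appium driver
        // constructed against the provider's remote hub URL.
        System.out.println(caps.get("platformName")); // prints: Android
        System.out.println(caps.size());              // prints: 3
    }
}
```

Keeping the capability map in one factory method makes it easy to switch between local devices and the cloud by changing configuration rather than test code.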
We will discuss a subset of six relevant automation “laws” from different fields of study, in chronological order, and their effect on IT, viewed through the prism of QA/QC. Then we will try to connect everything into one “scientific” picture of the world of automation. And, of course, we will prepare plenty of material for further study, deeper understanding, and practical use in everyday work. The material is based on the speaker's many years of management and lecturing experience and may be useful for everyone, from young specialists to IT mastodons, at least as a way to systematize practical experience. The talk will be “near-scientific” yet explained in plain terms, understandable and useful for the widest circle of listeners, and aimed at understanding IT processes, the basis of conscious success in the field.
When exploring and customizing open-source software, business analysts formulate only high-level requirements; the functional details emerge during development or after customer feedback. In this situation testers receive poor requirements, and developers grow frustrated when testers ask “how does it work?” too frequently. My report will show, on a real example, how we solved this problem.
Denis and Zhenya will tell their stories about moving to Minsk: what is good, what is not, and whether it was worth it; why they decided on such a radical change and what they got as a result. Plus the eternal questions: “Who is to blame?” and “What is to be done?”
We will talk about the unjustified career growth of a testing engineer and the potential problems that may appear along the way, share the speaker's personal experience, and try to draw conclusions: how to remain the only “surviving” testing engineer, inherit a huge project with a ton of services, and not go mad; how to combine testing new functionality, supporting builds and the test environment, and controlling the release process; and how, after failing to find a skilled tester for the team for a year, to successfully integrate and train a non-IT specialist.
On the potential complexity of testing web services. We will talk about the foundations of API testing, and the speaker will share his experience of building a framework for effectively organizing automated regression testing from scratch under tight deadlines. I am sure these ideas and the speaker's practical experience will be useful to listeners, especially those who may encounter web service testing in a new role.
Postman is a great tool for API automation, but it requires additional software to be installed, additional skills to be learned, and additional configuration to be done. Is there a way for a Java QA automation expert to skip all these limitations and get straight to automating API tests in a familiar environment? Sure there is! Let's talk about ways to automate API tests using Java.
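As a taste of the topic, here is a minimal sketch of an API test building block in plain Java using the standard JDK 11+ `HttpClient` API, with no extra tooling. The endpoint URL and resource path are placeholders, not a real service:

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.time.Duration;

// Sketch: building an API request for a test in plain Java (JDK 11+).
public class ApiTestSketch {
    static HttpRequest buildGetUser(String baseUrl, int id) {
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/users/" + id)) // placeholder endpoint
                .timeout(Duration.ofSeconds(5))
                .header("Accept", "application/json")
                .GET()
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = buildGetUser("https://api.example.com", 42);
        // In a real test we would send it and assert on the response:
        //   HttpResponse<String> resp = HttpClient.newHttpClient()
        //       .send(req, HttpResponse.BodyHandlers.ofString());
        //   assert resp.statusCode() == 200;
        System.out.println(req.method()); // prints: GET
        System.out.println(req.uri());    // prints: https://api.example.com/users/42
    }
}
```

Factoring request construction into small methods like `buildGetUser` keeps the tests readable and lets assertions focus on status codes and response bodies.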