Comparing GUI automation recording approaches

I recently joined the automation team. We are currently developing an automation framework for our web app, which has already been live in the field for quite some time. I designed and developed the framework to execute any test script recorded with Sahi Open Source 4.4. I am stuck evaluating the approaches for recording GUI test cases.

Application under test: the AUT is a web application that uses AJAX, along with some custom graphics and reporting.

The available recording approaches are as follows:

  • Recording regression bugs: There are around 3000 bugs, closed or open, in the system that could be automated and used as a regression suite. The upfront problem here is that even after 3000 recorded flows, the automation may not cover the complete web app.
  • Recording your business flows: This approach records each business flow along with its validations. It would eventually cover the complete web app if data-driven testing is used smartly. But the web app is complex and can have a large number of flows even with data-driven testing. We are also working to a tight time frame, which might rule this method out. Also, some of the flows are complex enough that the script and its data-driven support might become very complex and, in turn, unmaintainable.
  • Recording validations and flows independently: This approach records validations as one test case and the actual business use case as a separate test case. This reduces the size of my recorded scripts and hence keeps them maintainable. The separation also gives me an exact point of failure, telling me whether a field validation or the business flow itself is failing, which simplifies debugging a script failure (a minimal sketch of this separation follows the list). But I am unable to decide whether this approach has any unseen issues.
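
For illustration, here is a minimal sketch of what approach 3 might look like once the recorded steps are exported to code. Everything in it is a hypothetical stand-in: the OrderPage wrapper, its methods, and the JUnit test names are not part of the actual framework, just one shape the separation could take.

```java
// A minimal sketch of approach 3, assuming a hypothetical OrderPage
// wrapper around the locators that the recorder identified. Field
// validations and the business flow live in separate test cases, so a
// failure points at exactly one of the two.

import org.junit.Assert;
import org.junit.Test;

public class OrderTests {

    // Hypothetical wrapper; in practice it would drive the browser via
    // whichever runner the framework uses (e.g. the Sahi Java driver).
    static class OrderPage {
        void open() { /* navigate to the order form */ }
        void setQuantity(String value) { /* type into the quantity field */ }
        String validationMessage() { return "Quantity must be a positive number"; }
        void submitOrder(String product, int quantity) { /* drive the full flow */ }
        boolean confirmationShown() { return true; }
    }

    @Test
    public void quantityFieldRejectsNegativeValues() {
        // Validation-only test: small, fast, independent of the flow.
        OrderPage page = new OrderPage();
        page.open();
        page.setQuantity("-1");
        Assert.assertEquals("Quantity must be a positive number",
                page.validationMessage());
    }

    @Test
    public void orderFlowCompletesForValidInput() {
        // Flow-only test: assumes field-level validation is covered above.
        OrderPage page = new OrderPage();
        page.open();
        page.submitOrder("Widget", 3);
        Assert.assertTrue(page.confirmationShown());
    }
}
```

The point of the split is that a red validation test never exercises the flow, and vice versa, so a failing run names its culprit immediately.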

My doubts/concerns:

  • I want to know whether I can go ahead with approach 3 as the recording approach, or whether there is some other variation or a new approach altogether. I have tried to research how other companies record their tests but could not reach a conclusion.
  • I want to evaluate whether my approach 3 can support the agile methodology we are about to adopt.
  • Whether choosing Sahi Open Source as the recording tool is appropriate for an AJAX application.
  • Whether the recording tool's output should be exported as scripts, or exported in a programming language such as Java code.

Best answer

It is quite difficult to give a definitive answer here, but I can offer you some ideas.

  • Go with maintainability as a first priority. In my experience, once a regression suite is up and running, it can be a very long time before it goes away.
  • After maintainability, look to the 80/20 rule: the 20% of the application that gets 80% of the use. This is where your highest regression ROI comes from.
  • Once the 80/20 rule is covered, look for gaps in your coverage. These will show up as the areas where most of the regression bug reports come in from customers. The manual testers will also know which parts of the application are most fragile.
  • Each time you add coverage to your regression tests, start with a smoke test of core functionality for the feature you're adding. Then expand to the 80/20 rule. After that you can consider adding tests for the bugs that have been reported against that feature.
  • Don't worry about the tool. If it can reach the components on the page that it needs to reach, it's good enough for the task.
  • Be prepared for chaos during the adoption of agile. I've yet to see an adoption of any agile implementation that didn't include a phase of utter chaos. Your concern for GUI automation is that you do not automate against a moving target: as a rule, GUI automation should happen after each slice of functionality is stable. Any attempt at GUI automation before this turns into thrashing (yes, I've been there). Whichever approach you choose will make no difference: the time needed to build GUI regression doesn't change because the development methodology changes. On the plus side, if moving to a more agile approach adds unit testing, you should be able to reduce the amount of GUI regression that focuses on areas which are more properly the domain of unit tests.
  • Record/playback is dangerous. I can't stress this enough. Almost every tool sells itself on being "code-free", but the truth is that sooner or later you will need to move to using the record feature only to identify the components you need to interact with, and then code your interactions for maximum reusability and maintainability (see the sketch after this list).
  • Output doesn't matter; reusability and maintainability do. As long as you can run the regression on any system, and spin systems up and down at will, it doesn't matter whether you export your tool output as scripts or as source in a compiled language such as Java.
