    Record subtest values in Dromaeo tests · 081e85fe
    rniwa@webkit.org authored
    https://bugs.webkit.org/show_bug.cgi?id=124498
    
    Reviewed by Andreas Kling.
    
    PerformanceTests: 
    
    Made Dromaeo's test runner report values in DRT.progress via newly added PerfTestRunner.reportValues.
    
    * Dromaeo/resources/dromaeorunner.js:
    (.): Moved the definition out of DRT.setup.
    (DRT.setup): Ditto.
    (DRT.testObject): Extracted from DRT.setup. Set the subtest name and continueTesting.
    continueTesting is set to true for subtests; i.e., when a name is specified.
    (DRT.progress): Call PerfTestRunner.reportValues to report subtest results.
    (DRT.teardown): Call PerfTestRunner.reportValues instead of measureValueAsync.
    
    * resources/runner.js: Made various changes for newly added PerfTestRunner.reportValues.
    (.): Moved the initialization of completedIterations, results, jsHeapResults, and mallocHeapResults into
    start since they need to be initialized before running each subtest. Initialize logLines here since we
    need to use the same logger for all subtests.
    (.start): Initialize the variables mentioned above here. Also respect doNotLogStart used by reportValues.
    (ignoreWarmUpAndLog): Added doNotLogProgress. Used by reportValues since it reports all values at once.
    (finish): Compute the metric name, such as FrameRate or Runs, from the unit. Also, don't log or
    notify done when continueTesting is set on the test object.
    (PerfTestRunner.reportValues): Added. Reports all values for the main/sub test.
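    The unit-to-metric-name derivation in finish can be sketched as follows. This is a minimal
    illustration in Python of the mapping described above, not the actual runner.js code; the
    exact table (beyond FrameRate and Runs, which the entry names) is an assumption.

    ```python
    # Hypothetical sketch: deriving a metric name from a measurement unit, as
    # PerfTestRunner's finish is described to do ("FrameRate or Runs from the unit").
    # Only "fps" -> FrameRate and "runs/s" -> Runs are stated above; the rest is assumed.
    UNIT_TO_METRIC = {
        "fps": "FrameRate",   # frames per second
        "runs/s": "Runs",     # runs per second
        "ms": "Time",         # assumed: elapsed time in milliseconds
    }

    def metric_name_for_unit(unit):
        """Return the metric name for a unit, defaulting to 'Time' (an assumption)."""
        return UNIT_TO_METRIC.get(unit, "Time")
    ```
    
    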
    
    Tools: 
    
    Supported parsing subtest results.
    
    * Scripts/webkitpy/performance_tests/perftest.py: Replaced _metrics with an ordered list of subtests, each of
    which contains a dictionary with its name and an ordered list of the subtest's metrics.
    (PerfTest.__init__): Initialize _metrics as a list.
    (PerfTest.run): Go through each subtest and its metrics to create a list of TestMetrics.
    (PerfTest._run_with_driver):
    (PerfTest._ensure_metrics): Look for a subtest then a metric in _metrics.
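    A minimal sketch of the _metrics structure and lookup described above: an ordered list of
    subtests, each a dict holding a name and an ordered list of metrics, searched subtest-first
    and then metric. The class and field names here are illustrative, not webkitpy's actual code.

    ```python
    # Hedged sketch of the described _ensure_metrics lookup (subtest first, then metric).
    # Structure assumed from the entry above: [{'name': ..., 'metrics': [...]}, ...].
    class PerfTestSketch:
        def __init__(self):
            # Ordered list of subtests; the main test uses name=None (an assumption).
            self._metrics = []

        def _ensure_metrics(self, metric_name, subtest_name=None, unit=None):
            # Look for the subtest first.
            for subtest in self._metrics:
                if subtest['name'] == subtest_name:
                    break
            else:
                subtest = {'name': subtest_name, 'metrics': []}
                self._metrics.append(subtest)
            # Then look for the metric within that subtest.
            for metric in subtest['metrics']:
                if metric['name'] == metric_name:
                    return metric
            metric = {'name': metric_name, 'unit': unit, 'values': []}
            subtest['metrics'].append(metric)
            return metric
    ```
    
    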
    
    * Scripts/webkitpy/performance_tests/perftest_unittest.py:
    (TestPerfTest._assert_results_are_correct): Updated the assertions per changes to _metrics.
    (TestPerfTest.test_parse_output): Ditto.
    (TestPerfTest.test_parse_output_with_subtests): Added the metric and the unit on each subtest result as well as
    assertions to ensure subtest results are parsed properly.
    (TestReplayPerfTest.test_run_with_driver_accumulates_results): Updated the assertions per changes to _metrics.
    (TestReplayPerfTest.test_run_with_driver_accumulates_memory_results): Ditto.
    
    * Scripts/webkitpy/performance_tests/perftestsrunner.py:
    (_generate_results_dict): When the metric for a subtest is processed before that of the main test, the url is
    incorrectly suffixed with '/'. Fixed this by re-computing the url with TestPerfMetric.test_file_name when
    adding new results.
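    The url fix above can be sketched as re-deriving the url from the test file name rather than
    accumulating suffixes. The function name and base url here are assumptions for illustration,
    not webkitpy's actual API.

    ```python
    # Hedged sketch: rebuild the results url from the test file name so a subtest
    # processed first cannot leave a spurious trailing '/'. Base url is assumed.
    def url_for_test(test_file_name,
                     base="https://trac.webkit.org/browser/trunk/PerformanceTests"):
        # Stripping slashes from the file name keeps the joined url canonical.
        return base + "/" + test_file_name.strip("/")
    ```
    
    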
    
    * Scripts/webkitpy/performance_tests/perftestsrunner_integrationtest.py:
    (TestWithSubtestsData): Added.
    (TestDriver.run_test):
    (MainTest.test_run_test_with_subtests): Added.
    
    LayoutTests: 
    
    Rebaselined the test.
    
    * fast/harness/perftests/runs-per-second-log-expected.txt:
    
    
    git-svn-id: http://svn.webkit.org/repository/webkit/trunk@159805 268f45cc-cd09-0410-ab3c-d52691b4dbfc