add user interaction to resolve browser metric #1938
base: dev
Conversation
@hagrawal4 Can you please add a link to the build where these test changes executed successfully? Thanks.
Force-pushed from 7954ce7 to e9cee29
I'm actually not expecting any measurable change in when Ready For User is triggered with the new dc-browser-metrics. There must be something wrong with the implementation if it's causing RFU to increase. Do we take a timestamp as soon as metric.end() is invoked? Why would that be changed by resolving every metric we include with the RFU data?
        return wrapper

    def measure_with_browser_metrics(interaction_name, webdriver, datasets, measure_func, post_metric_measure_func=None):
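For orientation, here is a minimal sketch of what a wrapper with the signature above might do; everything beyond the signature itself (the timing and dataset handling) is an assumption for illustration, not the PR's actual implementation.

    import time

    def measure_with_browser_metrics(interaction_name, webdriver, datasets,
                                     measure_func, post_metric_measure_func=None):
        # Time only the user-facing action; this window is what gets reported.
        start = time.monotonic()
        measure_func()
        elapsed = time.monotonic() - start

        # Record the measurement (this dataset layout is a guess).
        datasets.setdefault(interaction_name, []).append(elapsed)

        # Whatever is needed to resolve browser metrics (e.g. a synthetic
        # user interaction) runs after the timer has stopped, so it cannot
        # inflate the reported timing.
        if post_metric_measure_func is not None:
            post_metric_measure_func()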
Good impl and metrics collection
The tests collect multiple data points per test, including rfu, and there is no increase in the rfu timing. measure_browser_navi_metrics is what caused the increase in timing earlier: it takes multiple attempts with a 0.5 s sleep to resolve the browser metric. In this PR we no longer count the time to resolve the browser metric, only the rfu time. More details here: https://atlassian.slack.com/archives/CFHUQ2YQP/p1764913016398969?thread_ts=1763443821.738109&cid=CFHUQ2YQP
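A minimal sketch of the behavior described in that comment, assuming hypothetical do_user_action and browser_metrics_resolved callables; the point is that the 0.5 s polling happens outside the recorded rfu window.

    import time

    def measure_rfu_then_resolve(webdriver, do_user_action, browser_metrics_resolved,
                                 max_attempts=10):
        # Record rfu for the action itself, before any metric polling starts.
        start = time.monotonic()
        do_user_action(webdriver)
        rfu_seconds = time.monotonic() - start

        # Poll for metric resolution afterwards; up to max_attempts * 0.5 s of
        # waiting is deliberately excluded from the recorded rfu time.
        for _ in range(max_attempts):
            if browser_metrics_resolved(webdriver):
                break
            time.sleep(0.5)  # the 0.5 s retry interval mentioned above
        return rfu_seconds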
Type of Change
What changed?
Context (Why?)
dc-browser-metric v10.4.0 introduced a new LCP metric that requires a user interaction before the browser metric can resolve. The performance tests are not able to resolve the browser metric, which is causing a performance regression.
Here's the detailed analysis doc on the performance regression: https://hello.atlassian.net/wiki/spaces/CSD/pages/6133557390/Performance+Testing+Challenges
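For context on why an interaction resolves the metric: browsers stop emitting largest-contentful-paint entries after the first user input, so an automated test can finalize LCP with a harmless synthetic interaction. The Selenium sketch below (trigger_user_interaction is a hypothetical name) illustrates one way to do that; it is not code from this PR.

    from selenium.webdriver.common.action_chains import ActionChains

    def trigger_user_interaction(webdriver):
        # LCP entries stop being emitted after the first user input, so a
        # harmless synthetic click lets the browser finalize the metric.
        ActionChains(webdriver).move_by_offset(1, 1).click().perform()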
Tests
DCAPT version comparison results:
9.2.10 vs 10.2.0-beta7 (link): 10.2.0-beta7 has the newer browser-metric v10.4.0
9.2.10 vs 10.2.0-rc1 (link): 10.2.0-rc1 does not have the newer browser-metric changes
Some of the non-DOM actions show a p90_diff greater than 10%, which is expected: the new logic adds rfu time to the tests, and in some scenarios that rfu time is greater than the time the older browser metric (without LCP) took to resolve.