Compatibility/Meetings/2018-12-work-week


This is the agenda and logistics page for Web Compatibility Team Work Week meeting in Orlando (USA) from December 3 - December 7, 2018.

Logistics

  • Location: Orlando, USA
  • Hotel: Dolphin and Swan Hotel, DisneyWorld

Attendees

Hotel         | Attendee          | Arrival        | Departure
--------------|-------------------|----------------|----------------
Dolphin Hotel | Adam Stevenson    | Monday, 19:33  | Saturday, 07:00
Dolphin Hotel | Dennis Schubert   | Monday, 17:55  | Saturday, 20:20
Dolphin Hotel | Guillaume Demesy  | Monday, 18:31  | Saturday, 13:25
Dolphin Hotel | Karl Dubost       | Monday, 21:39  | Saturday, 06:45
Dolphin Hotel | Kate Manning      | Monday, 17:55  | Saturday, 20:20
Dolphin Hotel | Mike Taylor       | Monday, early  | Saturday, early
Dolphin Hotel | Tom Wisniewski    | Monday, 19:38  | Saturday, 12:40
Dolphin Hotel | Mariana Meireless | Monday         | Saturday

Regrets

none.

Minutes

(Mostly scribed by Denschub)

OKR 2019H1 ( 👹 )

  • mike: (Going through all the things we did in 2018Q4).
  • mike: We still have stuff in progress. We are working on Blipz v2. Rozanna wants to try to get development staffed in Innovations for version 3. We are working on metrics stuff. I would love to have metrics for needstriage, sitewait, etc.
  • karl: it should not be very complicated, because we just need to track the total number of issues per milestone (a small sketch follows this list).
  • mike: Track new top sites. It would start manually, according to the formula. If we decide the numbers are interesting, we can build software for it. We need to prepare a report at least once, and probably one update.
  • karl: after the prototype is done, we can decide what we would do with it and how much resource it takes.
  • mike: do you have time to finish the anonymous reporting channel?
  • karl: yes. 2019Q1 for evaluation.
  • mike: We need to figure out if we need to do more enterprise stuff/testing.
  • karl: most of the time, we didn't have strong usability issues.
  • mike: patch process. More work.
  • mike: About tinker tester
  • karl: too complex for me in the current situation.
  • mike: it's worth investing the time in tooling so we are more effective.
  • mike: figuring out how to detect duplicate issues. I don't know if we do this or not.
  • dennis: what Eric started to do.
  • mike: it could be helpful to have the results.
  • karl: (disagree :) It's more complicated.)
  • adam: it could be an outreachy project
  • mike: site snapshots (saving all resources)
  • karl: good idea for the data, but bad idea for privacy. Probably would need to be researched before starting development. Survey on our current reporting.
  • karl: it could be very useful to have probes detecting some known issues (marfeel, fastclick, etc.) and adding probes little by little.
  • adam: https://github.com/mozilla/OpenWPM
  • mike: Do we keep pushing on Google Tier1 testing? Do we think it's useful?
  • karl: we need something to capture the work surrounding performance improvements to webcompat.com
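
To make Karl's milestone point concrete: webcompat.com tracks workflow state with GitHub milestones (needstriage, needsdiagnosis, sitewait, ...), so the counts can be read straight off the milestones endpoint. A minimal sketch, assuming the public GitHub REST v3 API and the webcompat/web-bugs repository; the function name and token handling are illustrative only, not an agreed design:

 # Count open/closed issues per workflow milestone on webcompat/web-bugs.
 # Assumption: the GitHub REST v3 milestones endpoint, which reports
 # open_issues/closed_issues for each milestone.
 import requests

 def milestone_counts(repo="webcompat/web-bugs", token=None):
     headers = {"Accept": "application/vnd.github.v3+json"}
     if token:
         headers["Authorization"] = "token " + token
     resp = requests.get(
         "https://api.github.com/repos/%s/milestones" % repo,
         params={"state": "all", "per_page": 100},
         headers=headers,
         timeout=30,
     )
     resp.raise_for_status()
     return {m["title"]: (m["open_issues"], m["closed_issues"])
             for m in resp.json()}

 if __name__ == "__main__":
     for title, (open_count, closed_count) in milestone_counts().items():
         print("%s: %d open / %d closed" % (title, open_count, closed_count))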

OKR 2019H1 Ideas Bank ( 👹 )

  • More graphing/dashboarding/visualization work - webcompat metrics work.
    • graphs for different stages of Web Bugs (sitewait, etc)
    • integrate triage dashboard for Softvision folks (and contributors)
    • recording different milestones on server side /db
      • time estimate: 1 day
    • front-end refactor
      • time estimate: 1 day
    • front-end implementation
      • time estimate: 1 week
    • database stuff + refactoring
      • time estimate: 1 week
  • Track top sites by assigning a score based on the site's priority, the number of reports, … (webcompat metric)
    • time estimate: 2 months
  • Evaluate the communication channel for anonymous reporting (follow up work to Q4 stuff)
    • time estimate: 1 day
  • Finish building the intervention rollout process for GoFaster
    • 1 month
  • about:compat VAPORWARE
    • we don’t know.
  • address Tinker Tester papercuts (Improvements: Evaluate what is useful, what we can improve, …) aka build dennis stuff.
    • time estimate: depends on scope. 1 week.
  • tinker chrome port:
    • time estimate: 1 week.
  • Duplicate Web Bug reports, or more general: a way to correlate issues more quickly/easily (GitHub beta?)
    • time estimate: NOBODY KNOWS
  • Research if Site Snapshots would be useful
    • time estimate: 2 days
  • console.log step 2 - Make it useful / usable
    • structure log
      • time estimate: 2 days
    • UI (front end)
      • time estimate: 2 weeks
    • privacy policy updates:
      • time estimate: 1 day
  • Leverage one telemetry probe, like for example tracking when FastClick is used (or at report-time)
    • time estimate: 2 months
  • Switching to python 3 for webcompat.com
    • time estimate: 1 week
  • Upgrade nginx
    • time estimate: 1 week
  • Participating in Lighthouse and/or webhint.io
  • Being more public about the core issues being fixed following webcompat issues (karl)
    • regular effort, 1 hour a week.
  • Performance improvement (outreachy) support by the team
    • time estimate: 3 months
  • Migration of labels on web-bugs issues; we need to write software for that and make sure new issues are labelled correctly (a rough sketch follows this list).
    • time estimate: 2 weeks
  • Analysis of Marfeel websites to identify the type of issues, for either fixing through site patching or contacting them
    • time estimate: 2 months
  • Creating a fastclick sitepatch with a whitelist of sites after investigating the issue.
    • time estimate: 1 month
  • Twitter bot for automatically posting a comment on the issue with the notice (after people have tweeted a link)
    • time estimate: 1 week
  • platform gap metrics work
    • time estimate: 1 month
  • follow up on tier 1 report card with improvements to the reporter
    • time estimate: 2 weeks

Vision and Alignment 2019 ( 👹 )

  • Mike: Platform will fix the bugs we deem as most important for webcompat. We need to be more efficient with Web Bugs to make that better.
  • Adam: Outreach is working fine IMO. I skip issues marked as Minor, and we probably need to be more aggressive about re-pinging.
  • Karl: Outreach is very slow (and expensive), and sometimes we may have better ways to fix sites (like Site Interventions). Outreach gets things done and the web bugs fixed, but it's really slow.
  • Mike: I agree that Site Interventions are important. We are working on a process to roll them out. These Interventions are not the long-term solution, but a way to reach our goal in the short term.
  • Mike: How's diagnosis going? Is it too much? What can we do better?
  • Karl: We still have too many bugs reaching the needsdiagnosis queue that don't actually need diagnosis. We need to be better at triage, and sometimes we need to just close an issue if it doesn't look relevant enough.
  • Mike: I agree that for the next 6 to 12 months, we need to focus on the most important stuff.
  • Karl (and Tom): Microsoft has previously closed such issues, saying they acknowledge there is a problem but won't be working on it.
  • Karl: We also need to build tooling/make tooling better.

(not scribing individual people, but group contents)

  • Diagnosis is a lot of work. With Karl doing amazing work in prediagnosis, pings from Karl are complex and take a lot of time.
  • Maybe we should not ping people directly, but rather work in buckets, so that people can focus on that.
  • Backlog pressure
  • We may be too perfectionistic. It’s not bad to just throw an issue to the site’s developers without having the final diagnosis. If we have a general idea on what the issue may be, we could do outreach.
  • We could be more active in pinging other engineers to have them help us out. It might be more efficient if people working on affected components help out.
  • Let’s make a label to assign to web bugs that we cannot resolve, so The Management(tm) can triage those and help out.
  • Some combination of self direction and management might be useful for diagnosis
  • Diagnosis context switching (getting up to speed, finding the groove) vs. brain pain.
  • Weekly diagnosis reports are useful. Good way to know what’s happening.
  • Grumpiness may only be an illusion (re:grumpy core engineer).


Firefox Webcompat Strategy Meeting (mike)

Hosted by Mike Taylor, many participants (can’t keep track of who’s speaking)

  • Google Tier1 search for mobile (win)
  • Fixed 15+ core interop bugs (webcompat P1, P2, P3) (win)
  • Interop vs Web compatibility
  • Improve broad web api compatibility with Blink, closing the webcompat gap
    • We don’t have infinite resources, we should be strategic with our efforts
  • Web Compat is hard to measure but we are making progress in this area (thanks Tim)
  • Review of a compat bug's lifecycle
    • Standards and specs >
    • APIs >
    • Gecko Platform >
      • Web Platform Tests
    • Firefox ships a release >
    • Web Devs >
    • Web Users
      • Web Compatibility bugs
  • Talked about what happens when a spec implementation goes well. CSS Grid is a good, newer example
  • Many of the webcompat bugs on the P1/P2 list have been old things that weren't spec'd properly or at all
    • If we had been paying attention to tests that were breaking in the past, would we have known?
    • In the case of window.event we knew and made a decision not to implement
    • Now that we have tests for what’s not working between browsers, do we know what’s being used and important?
    • When we're working on a feature, are the tests showing potential issues with implementations? We can link them to a webcompat issue afterwards.
    • What metric can we track to know that compat issues will occur?
      • Where there are gaps between Gecko and Webkit / Blink and try to rank the severity of it
    • Is there a way to surface the usage data along side the compat issues / platform tests?
    • Microsoft has really good usage data on this. The challenge is it will show 0.136% of the internet uses this feature, but that could include Facebook so you can’t remove it.
  • Measurement spectrum
    • What browsers implement
    • What sites plan to enable
    • What sites have enabled
    • (didn’t see this point)
  • Platform Gap Metric Proposal (proposal #1)
    • Take all the Chrome Status use counters, weight each one by its usage (a toy sketch follows this list)
    • This data is noisy
    • We are taking a closer look into this
    • You could implement all the features that Chrome has, but that may not fix your problem
  • Webcompat Metric (proposal #2)
    • We want to know that what we are working on in the platform is fixing real websites
    • Site Compat Index
      • Complex equation
      • Website's position in the top-200 list
      • Web compat reports affecting website
      • Number of duplicate reports
      • Priority and product weighting factor, indicating strategic importance
    • Total Site Compat Index
      • Complex equation
      • Combines all the individual ratings of the Top 200 list
  • We have a metric in our DX 3 year vision to track the number of top 100 sites that give us a tier 1 experience
  • In 2019 we want to close the webcompat gap
  • Stuck bugs
    • Sometimes we get caught up diagnosing a website
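
As a toy illustration of the Platform Gap Metric idea (proposal #1 in the list above): weight each Chrome use counter by its page-load usage and sum the weights of the counters Gecko does not implement. The feature names and usage numbers below are invented; real input would come from chromestatus.com use-counter data:

 # Toy platform-gap score: sum the Chrome page-load usage of features
 # Gecko lacks. All data below is made up for illustration.
 chrome_usage = {
     "FeatureA": 0.012,   # fraction of page loads using the feature
     "FeatureB": 0.0004,
     "FeatureC": 0.131,
 }
 gecko_supported = {"FeatureC"}

 def platform_gap(usage, supported):
     """Higher score = more (usage-weighted) platform surface missing from Gecko."""
     return sum(weight for feature, weight in usage.items()
                if feature not in supported)

 print("platform gap score: %.4f" % platform_gap(chrome_usage, gecko_supported))

As noted in the discussion, the counter data is noisy and implementing everything Chrome ships would not necessarily fix real sites, so a number like this could only ever be one signal among several.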

Questions?

  • Do we have any idea of test suites that are missing pieces and causing webcompat breakage?
    • Sometimes it’s compounded issues
  • Do we know which areas are on this?
    • We know there are a lot of issues with CSS and events (I think Karl said), possibly scrolling
  • How do we take the gaps we find in bugs and highlight them in web platform tests?
    • We can look back on bugs with compat type labels or Bugzilla dupes


Continued Vision and Alignment 2019 ( 👹 )

Attendees: Webcompat team (not scribing individual people, but group contents)

Metrics discussion

(Webcompat Metric proposal #2)

  • https://docs.google.com/document/d/1oAvIkGVM3HKUAumI4K_qV315ujCoeMbG_Z9aUBYBecw/edit
  • Concerned that our metric will not reflect the severity or impact of the reports to users
  • 2 severity-critical bugs may be more important than 20 severity-minor ones
  • Going to fill this out and try to understand how that would work
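
The actual Site Compat Index equation lives in the proposal doc linked above; the following is only a toy sketch of the kind of inputs discussed (top-200 rank, report count, duplicates, a priority/product weighting factor). The formula shape and constants are made up, and the severity concern raised above is exactly what this toy version does not capture:

 # Toy Site Compat Index: combine top-200 rank, report volume, duplicates
 # and a priority weight. Constants and the combination are invented.
 def site_compat_index(rank, reports, duplicates, priority):
     rank_weight = (201 - rank) / 200.0   # 1.0 for the #1 site, ~0 for #200
     return rank_weight * priority * (reports + 0.5 * duplicates)

 def total_site_compat_index(sites):
     """Sum the per-site indices across the whole top-200 list."""
     return sum(site_compat_index(**site) for site in sites)

 sites = [
     {"rank": 3, "reports": 4, "duplicates": 6, "priority": 1.5},
     {"rank": 150, "reports": 1, "duplicates": 0, "priority": 1.0},
 ]
 print(total_site_compat_index(sites))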

About:compat page

  • Need a decision if we can deliver it fast enough
    • Does product want it?

Agenda

Monday 3

Arrival day

  • 09:00 - 17:00 People arriving all day long. Time to sleep, relax, walk around, meet if you wish. Tomorrow is the big day of work.
  • 18:00 - 21:00 Welcome Reception

Tuesday 4

  • 09:00 - 10:30 Mozlando Plenary
  • 11:00 - 12:00 Defining the agenda for the week using the topics bank. We vote on what we prefer/prioritize.
  • 12:00 - 13:00 Outreachy lunch (for outreachy participants and mentors)
  • 13:00 - 15:00 Outreachy onboarding (for outreachy participants)
  • 13:00 - 15:00 H1 (Q1 & Q2) OKR Planning
  • 15:30 - 17:00 Engineering Lightning talks
  • 17:00 - 18:30 Down time / free time
  • 18:30 Hop on bus to Disney Springs for team dinner
  • 19:15 Team dinner at The Boathouse (click to see menu) Note: we have a reservation for 8 people, which is everyone on the immediate team.

Wednesday 5

  • 08:00 - 08:30 WebReplay demo and chat
  • 08:30 - 09:30 Compat/DevTools Diagnosis Party - Visuals Homeroom. (organized by Harald Kirschner)
    • Getting together to do some hands-on diagnosis for webcompat reports: https://webcompat.com/issues?stage=needsdiagnosis
    • Outcomes:
      • Successfully diagnose some issues, of course
      • Better shared understanding of web compat use cases, pain points and general experience; maybe commenting/filing some devtools issues
      • Quality DX team time
  • 09:30 - 10:00 Vision + Alignment 2019
  • 10:00 - 10:45 Measuring in 2019
  • 10:45 - 12:00 Hard to diagnose bugs
  • 12:00 - 13:00 Lunch
  • 13:00 - 17:00 Small groups time (the "management" will be in a training...)
  • 17:00 - 18:30 Free time
  • 18:30 - 22:30 Evening: All company evening event at Kennedy Space Center

Thursday 6

  • 09:00 - 10:00 Platform and Web Compat Strategy
  • 10:15 - 11:15 Developer Experience All Hands
  • 11:30 - 12:00 about:compat
  • 12:00 - 13:00 Lunch
  • 13:00 - 14:00 stalled needsdiagnosis
  • 14:30 - 15:30 Thomas or Dennis (out of the house)
  • 15:30 - 17:30 needsinfo check party
  • 18:00 - whenever. Dinner on your own. Feel free to coordinate with others, or have a quiet evening alone.

Friday 7

  • 09:00 - 12:00 Hacking, twitter bot, weekly diagnosis report, tinker tester
  • 12:00 - 13:00 Lunch
  • 13:00 - 15:00 Hacking
  • 15:00 - 16:00 Finalize OKRs
  • 16:15 - 17:00 Exec Q&A for Runtime and Visuals teams
  • 19:00 - 24:00 Closing event at The Wizarding World of Harry Potter at Universal Studios


Topics Bank

Use the following template in one of the sections below:

 ==== Topic ====
 Description with [http://example.com links] when necessary


Measuring Compat in 2019

An overview and discussion of 2 proposed metrics for measuring progress in 2019.


2019 Vision and Alignment

Let's make sure we're working on the most important things for this year. People are asking us what are the priorities.

Hard to Diagnose bugs

Discussion around the collected Hard to Diagnose Bugs.

Work/Dev environment for webcompat-metrics

Karl, Guillaume and Kate need to work together on harmonizing the dev/local environment and probably document it, so we avoid issues.

Define OKRs for 2019H1

To be sure to be ready to work on first day of 2019H1, we need to have defined OKRs, specifically 2019Q1. We might want to predefine ideas of OKRs for 2019Q2.

Bot monitoring twitter for webcompat issues notifications

Users sending anonymous reports often say they also submitted the issue on Twitter. How do we track that? Should we make it visible in the bug? (Borderline privacy-wise, since it links contexts.) A rough sketch of a monitoring bot follows.
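
A rough sketch of what such a bot could look like, assuming tweepy 3.x (the 2018-era Twitter API client) and a search for tweets linking to webcompat.com issues; the credentials, query string, and what to do with a match are placeholders, and the privacy question above would need an answer before anything like this runs:

 # Sketch: find recent tweets that link to a webcompat.com issue.
 # Assumes tweepy 3.x and application credentials passed in by the caller.
 import tweepy

 def find_webcompat_mentions(consumer_key, consumer_secret,
                             access_token, access_secret):
     auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
     auth.set_access_token(access_token, access_secret)
     api = tweepy.API(auth)

     # Search recent tweets mentioning a webcompat.com issue URL.
     for status in api.search(q='"webcompat.com/issues" -filter:retweets',
                              count=50):
         for url in status.entities.get("urls", []):
             expanded = url.get("expanded_url", "")
             if "webcompat.com/issues/" in expanded:
                 # A real bot would comment on the matching GitHub issue;
                 # here we only print what was found.
                 print("@%s tweeted about %s"
                       % (status.user.screen_name, expanded))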

FIXED IT

Core engineers are doing awesome work helping us and fixing webcompat issues. We need to make this more visible. Probably a weekly/monthly blog post in the style of Karl's lightning talk "FIXED IT".

Web Compatibility Improving Performance

Working with the Outreachy candidate on the Performance project for webcompat (Mike and Karl). Defining the steps: what should be kept for the project and what should be cut, so the scope isn't too broad.

Review of Tinker-Tester

Dennis, Karl, and Thomas will discuss the current features, what to add or remove, and anything related to the UX/design.

Improve the console.log reporting by the Reporter extension

Thomas, Karl (maybe Mike). See discussions in issue 2659 and https://bugzilla.mozilla.org/show_bug.cgi?id=1510073. Currently the reporter extension sends JSON to the form, which is then serialized to a string and held in a hidden field before being stored. That's not very satisfying. A couple of (possibly silly) ideas, with a rough sketch of the upload option after the see-also links:

  • git commit (by webcompat-bot) a structured JSON of the console.logs into the web-bugs repo.
  • upload the JSON the same way we do with images (create a link to the JSON file), which is added to the details.
  • Create a gist on webcompat bot account with the JSON.

See also https://bugzilla.mozilla.org/show_bug.cgi?id=1510067 and https://bugzilla.mozilla.org/show_bug.cgi?id=1510063
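
A minimal sketch of the second idea (upload the JSON the same way we do with images), assuming a Flask endpoint since webcompat.com is a Flask app; the route name, storage location, and URL shape are assumptions, not a decided design:

 # Sketch: accept the reporter extension's structured console-log JSON,
 # store it, and return a link that can be added to the issue details.
 import json
 import os
 import uuid

 from flask import Flask, jsonify, request

 app = Flask(__name__)
 UPLOAD_DIR = "uploads/console-logs"   # assumed storage location

 @app.route("/upload/console-logs", methods=["POST"])
 def upload_console_logs():
     payload = request.get_json(silent=True)
     if not isinstance(payload, list):
         return jsonify({"error": "expected a JSON array of log entries"}), 400
     if not os.path.isdir(UPLOAD_DIR):
         os.makedirs(UPLOAD_DIR)
     log_id = uuid.uuid4().hex
     with open(os.path.join(UPLOAD_DIR, log_id + ".json"), "w") as f:
         json.dump(payload, f, indent=2)
     # The issue body would embed this link instead of a hidden form field.
     return jsonify({"url": "/console-logs/%s.json" % log_id}), 201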

about:compat discussion

Discussion on the about:compat page: things that should be included, how we can implement and ship it, ... Harald is interested in the discussion.

Stalled needsdiagnosis from other vendors

We are starting to accumulate a growing collection of issues not taken care of by other vendors.

  • We could separate them in the graphs (issues with multiple labels). At least for Mozilla, that would better reflect the work being done.
  • We could try to revive the relationships. Another webcompat summit, with more maturity now, 3 years later?
  • Ask Microsoft about their Edge issues, given the news about Chromium for MS.

Stalled needsdiagnosis for certain labels

Some of these are not being taken care of. This is useless if we do not do anything about them.

  • Marfeel
  • GWS
  • Reality browser VR


To diagnose while in USA


Check-our-current-needsinfo party

Each of us goes through their current personal needsinfo issues and assesses: what's the next step? When can we commit effort to it? Or do we just pass?

weekly diagnosis report

is it useful? who is reading it?


*Ad Hoc Topic Additions*

Metrics Database

Let's talk about how we're saving our metrics data and whether the current framework makes the most sense going forward