Category: Mozilla

The Mozilla Blog: Firefox Team Looks Within to Lead Into the Future

For Firefox products and services to meet the needs of people’s increasingly complex online lives, we need the right organizational structure. One that allows us to respond quickly as we continue to excel at delivering existing products and develop new ones into the future.

Today, I announced a series of changes to the Firefox Product Development organization that will allow us to do just that, including the promotion of long-time Mozillian Selena Deckelmann to Vice President, Firefox Desktop.

“Working on Firefox is a dream come true,” said Selena Deckelmann, Vice President, Firefox Desktop. “I collaborate with an inspiring and incredibly talented team, on a product whose mission drives me to do my best work. We are all here to make the internet work for the people it serves.”

Selena Deckelmann, VP Firefox Desktop

During her eight years with Mozilla, Selena has been instrumental in helping the Firefox team address over a decade of technical debt, beginning with transitioning all of our build infrastructure over from Buildbot. As Director of Security and then Senior Director, Firefox Runtime, Selena led her team to some of our biggest successes, ranging from big infrastructure projects like Quantum Flow and Project Fission to key features like Enhanced Tracking Protection and new services like Firefox Monitor. In her new role, Selena will be responsible for growth of the Firefox Desktop product and search business.

Rounding out the rest of the Firefox Product Development leadership team are:

Joe Hildebrand, who moves from Vice President, Firefox Engineering into the role of Vice President, Firefox Web Technology. He will lead the team charged with defining and shipping our vision for the web platform.

James Keller, who currently serves as Senior Director, Firefox User Experience, will help us better navigate the difficult trade-off between empowering teams and maintaining a consistent user journey. This work is critically important because, since the Firefox Quantum launch in November 2017, we have been focused on putting the user back at the center of our products and services. That begins with a coherent, engaging and easy-to-navigate experience in the product.

I’m extraordinarily proud to have such a strong team within the Firefox organization that we could look internally to identify this new leadership team.

These Mozillians and I will eventually be joined by two additional team members: one who will head up our Firefox Mobile team, and another who will lead the team that has been driving our paid subscription work. Searches for both roles will be posted.

Alongside Firefox Chief Technology Officer Eric Rescorla and Vice President, Product Marketing Lindsey Shepard, I look forward to working with this team to meet Mozilla’s mission and serve internet users as we build a better web.

You can download Firefox here.

The post Firefox Team Looks Within to Lead Into the Future appeared first on The Mozilla Blog.

Daniel Stenberg: Coming to FOSDEM 2020

I’m going to FOSDEM again in 2020; this will be the 11th consecutive year I’m traveling to this awesome conference in Brussels, Belgium.

At this, my 11th FOSDEM visit, I will also deliver my 11th FOSDEM talk: “HTTP/3 for everyone”. It will happen at 16:00 on Saturday the 1st of February 2020, in Janson, the largest room on the campus. (It will be my third talk in the main track.)

For those who have seen me talk about HTTP/3 before, this talk will certainly have overlaps, but I’m also always refreshing and improving the slides, and I update them as the process moves on, things change and I get feedback. I spoke about HTTP/3 already at FOSDEM 2019 in the Mozilla devroom (at which time there was a looong line of people who tried, but couldn’t, get a seat in the room) – but I think you’ll find that there are enough changes and improvements in this talk to keep you entertained this year as well!

If you come to FOSDEM, don’t hesitate to come say hi and grab a curl sticker or two – I intend to bring and distribute plenty – and talk curl, HTTP and Internet transfers with me!

You will most likely find me at my talk, in the cafeteria area or at the wolfSSL stall. (DM me on twitter to pin me down! @bagder)

Patrick Cloke: Cleanly removing a Django app (with models)

While pruning features from our product it was necessary to fully remove some
Django apps that had models in them. If the code is just removed, then the tables
(and some other references) will be left in the database.

After doing this a few times for work I came up …

Patrick Cloke: Using MySQL’s LOAD DATA with Django

While attempting to improve the performance of bulk inserting data into a MySQL
database, my coworker came across the LOAD DATA SQL statement. It allows you
to read data from a text file (in a comma-separated-values-like format) and
quickly insert it into a table. There are two variations of it …

Mozilla VR Blog: Hello WebXR

Hello WebXR

We are happy to share a brand new WebXR experience we have been working on called Hello WebXR!

Here is a preview video of how it looks:

We wanted to create a demo to celebrate the release of the WebXR v1.0 API!

The demo is designed as a playground where you can try different experiences and interactions in VR, and introduce newcomers to the VR world and its special language in a smooth, easy and nice way.

How to run it

You just need to open the Hello WebXR page in a WebXR-capable browser (or a WebVR-capable one, thanks to the WebXR polyfill), such as Firefox Reality or Oculus Browser on standalone devices like the Oculus Quest, or Chrome 79 or later on desktop. For an updated list of supported browsers, please visit the ImmersiveWeb.dev support table.
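If you are curious what such a capability check looks like in code, here is a minimal sketch. The `navigator.xr.isSessionSupported("immersive-vr")` call is the standard WebXR detection API; the injectable `xr` parameter and the stub below are assumptions made purely so the logic can be shown (and run) outside a browser.

```javascript
// Sketch: detect WebXR "immersive-vr" support before offering a VR button.
// A real page would call supportsImmersiveVR(navigator.xr).
async function supportsImmersiveVR(xr) {
  if (!xr || typeof xr.isSessionSupported !== "function") {
    // No native WebXR; this is where the WebXR polyfill could bridge WebVR.
    return false;
  }
  return xr.isSessionSupported("immersive-vr");
}

// Hypothetical stub standing in for navigator.xr:
const fakeXR = { isSessionSupported: async (mode) => mode === "immersive-vr" };
```

In a page you would typically gate the "Enter VR" UI on the resolved boolean and fall back to an inline (non-immersive) rendering otherwise.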

Features

The demo starts in the main hall, where you can find:

  • Floating spheres containing 360° mono and stereo panoramas
  • A pair of sticks that you can grab to play the xylophone
  • A painting exhibition where paintings can be zoomed and inspected at will
  • A wall where you can use a graffiti spray can to paint whatever you want
  • A Twitter feed panel where you can read tweets with the hashtag #hellowebxr
  • Three doors that teleport you to other locations:
    • A dark room to experience positional audio (can you find where the sounds come from?)
    • A room displaying a classical sculpture captured using photogrammetry
    • The top of a building in a skyscraper district (are you scared of heights?)

Goals

Our main goal for this demo was to build a nice-looking, smooth-performing experience where you could try different interactions and explore multiple use cases for WebXR. We used the Quest as our target device to demonstrate that WebXR is a perfectly viable platform not only for powerful desktops and headsets but also for more humble devices like the Quest, where resources are scarce.

Also, by building real-world examples we learn how web technologies, tools, and processes can be optimized and improved, helping us to focus on implementing actual, useful solutions that can bring more developers and content to WebXR.

Tech

The demo was built using web technologies, with the three.js engine and, in some parts, our ECSY framework. We also used the latest standards such as glTF with Draco compression for models and Basis for textures. The models were created using Blender, and baked lighting is used throughout the demo.

We also used third-party content, like the photogrammetry sculpture (from this fantastic scan by Geoffrey Marchal on Sketchfab) and public domain sounds from freesound.org; the classic paintings are taken from the public online galleries of the museums where they are exhibited.

Conclusions

There are many things we are happy with:

  • The overall aesthetic and “gameplay” fits perfectly with the initial concepts.
  • The way we handle the different interactions in the same room, based on proximity or state, made everything easier to scale.
  • The demo was created initially using only Three.js, but we successfully integrated some functionality using ECSY.

And other things that we could improve:

  • We released fewer experiences than we initially planned.
  • Overall the tooling is still a bit rough and we need to keep on improving it:
    • When something goes wrong it is hard to debug remotely on the device. This is even worse if the problem comes from WebGL. ECSY tools will help here in the future.
    • State of the art technologies like Basis or glTF still lack good tools.
  • Many components could be designed to be more reusable.

What’s next?

  • One of our main goals for this project is also to have a sandbox that we could use to prototype new experiences and interactions, so you can expect this demo to grow over time.
  • At the same time, we would like to release a template project with an empty room and a set of default VR components, so you can build your own experiments using it as a boilerplate.
  • Improve the input support by using the great WebXR gamepads module and the WebXR Input profiles.
  • We plan to write a more technical postmortem article explaining the implementation details and content creation.
  • ECSY was released after the project started so we only used it on some parts of the demo. We would like to port other parts in order to make them reusable in other projects easily.
  • Above all, we will keep investing in new tools to improve the workflow for content creators and developers.

Of course, the source code is available for everyone. Please give Hello WebXR a try and share your feedback or issues with us on the GitHub repository.

Rubén Martín: Modernizing communities – The Mozilla way

It’s been a long time since I’ve wanted to write deeply about my work empowering communities. I want to start with this article sharing some high-level learnings around working on community strategy.

Hi, I’m Rubén Martín and I work as a Community Strategist for Mozilla, the non-profit behind Firefox Browser.

I’ve been working with, and participating in, open/libre source communities since 2004 – the first decade as a volunteer, before making open source my full-time career – joining Mozilla as staff five years ago, where as part of my Community Strategist role I’ve been focused on:

  • Identifying and designing opportunities for generating organizational impact through the participation of volunteer communities.
  • Designing open processes for collaboration that provide a nice, empowering and rich experience for contributors.

During these many years, I have witnessed incredible change in how communities engage, grow and thrive in the open source ecosystem and beyond, and ‘community’ has become a term more broadly tied to product marketing and success in many organizations. At Mozilla our community strategy, while remaining dedicated to the success of projects and people, has been fundamentally modernized and optimized to unlock the true potential of a Mission-Driven Organization by:

  • Prioritizing data, or what I refer to as ‘humanizing the data-driven approach’.
  • Implementing our Open By Design strategy.
  • Investing in our contributor experience holistically.

Today Mozilla’s communities are a powerhouse of innovation, unlocking much more impact across different areas of the organization, like the Common Voice 2019 Campaign, where we collected 406 hours of public domain voice recordings to help open up the speech-to-text market, or the Firefox Preview Bug Hunter Campaign, with more than 500 issues filed and 8,000 installs in just two weeks, which was fundamental to launching that browser sooner.

Read on to learn how we got there.

The roots, how did we get here?

“IMG_2255” by Janellie is licensed under CC BY-NC-ND 2.0

Mozilla has grown from a small community of volunteers and a few employees into an organization with 1,200+ employees and tens of thousands of volunteers around the world. You can imagine that Mozilla and its needs in 2004 were completely different from the current ones.

Communities were created and organized organically for a long time here at Mozilla. Anyone with the time and energy to mobilize a local community created one and tried to provide value to the organization, usually through product localization or helping with user support.

I like a quote from my colleague Rosana Ardila, who long ago said that contributing opportunities at Mozilla were a “free buffet” where anyone could jump into anything, with no particular order of importance, and where many of the dishes had long been empty. We needed to “refill” the buffet.

This is how a lot of libre and open source communities usually operate: open by default, with a ton of entry points to contribute. Unfortunately, some or most of those entry points are not fully optimized for contributions and therefore don’t offer a great contribution experience.

Impact and humanizing the data-driven approach

“Interview with David LePôle for Databit.me” by hellocatfood is licensed under CC BY-NC-SA 2.0

Focusing on impact and at the same time humanizing the data-driven approach were a couple of fundamental changes that happened around 4-5 years ago and completely changed our approach to communities.

When you have a project with a community around it, there are usually two fundamental problems to solve:

  1. Provide value to the organization’s current goals.
  2. Provide value to the volunteers contributing *

If you tilt the balance too far toward the first, you risk turning your contributors into “free labor”; if you tilt it too far in the other direction, your contributors’ efforts are highly likely to become irrelevant to the organization.

* The second point is the key factor to humanize the approach, and something people forget when using data to make decisions: It’s not just about numbers, it’s also people, human beings!

How do you even start to balance both?

RESEARCH!

Any decision you make should be informed by data. People in charge of community strategy or management sometimes have “good hunches” or “assumptions”, but acting on those is a risky business you need to avoid unless you have data to support them.

Do internal research to understand your organization, where it is heading, what are the most important things and the immediate goals for this year. Engage into conversations to understand why these goals are important with key decision makers.

Do internal research to also understand your communities and contributors: who they are, why they are contributing (their motivation), where, and how. Gather this data both quantitatively (stats from tools) and qualitatively (surveys, conversations).

This will provide you with an enormous amount of information for figuring out where impact can be boosted, and for understanding how your communities and contributors are operating. Are they aligned? Are they happy? If not, why not?

Also do external research to understand how other, similar organizations are solving the same problems; get out of your internal bubble and be open to learning from others.

A few years ago we did all of this at Mozilla from the Open Innovation team, and it really informed our strategy moving forward. We keep doing internal and external research regularly in all of our projects to inform any important decisions.

Open by default vs open by design

“Sometimes Open Needs a Push” by cogdogblog is licensed under CC0 1.0

I initially mentioned that being open by default can lead to a poor contributor experience, which is something we learned from this research. If you think your approach will benefit from being open, please do so with intention, do so by design.

Pointing people to donate their free time to suboptimal contributor experiences will do more harm than good. If something is not optimized for, or doesn’t need, external contributions, you shouldn’t point people there; clarify the expectations with everyone upfront.

Working with different teams and stakeholders inside the organization is key to designing and optimizing impactful opportunities. This is something we have done over the past years at Mozilla in the form of Activate Campaigns: a regular-cadence set of opportunities designed by the community team in collaboration with different internal projects, optimized both for boosting their immediate goals and for being an engaging and fun experience for our Mission-Driven contributors.

The contributor experience

In every organization there is always going to be a tension between immediate impact and long term sustainability, especially when we are talking about communities and contributors.

Some organizations will have more room than others to operate in the long term, and I’m privileged to work in an organization that understands the value of long term sustainability.

If you optimize only for immediate value, you risk your communities falling apart in the medium term; but if you optimize only for the long term, you risk the immediate success of the organization.

Find the sweet spot between the two. Maybe that’s 70–30% immediate-to-long-term, or 80–20%; it really depends on the resources you have and where your organization is right now.

The way we approached it was to always have relevant and impactful (measurable) opportunities for people to jump into (through campaigns) and, at the same time, work on the big seven themes we found we needed to fix as part of our internal research.

I suspect these themes are also relevant to other organizations. I won’t go into full detail in this article, but I’d like to list them here:

  • Group identities: Recognize and support groups both at regional and functional level.
  • Metrics: Increase your understanding of the impact and health of your communities.
  • Diversity and inclusion: How do you create processes, standards and workflows to be more inclusive and diverse?
  • Volunteer leadership: Shared principles and agreements on volunteer responsibility roles to have healthier and more impactful communities.
  • Recognition: Create a rewarding contributor experience for everyone.
  • Resource distribution: Standards and systems to make resource distribution fair and consistent across the project.
  • Contributor journey and opportunity matching: Connect more people to high impact opportunities and make it easy to join.

Obviously this is something you will need a strong community team to move forward with, and I was lucky to work with excellent colleagues on the Mozilla Community Development Team on this: Emma, Konstantina, Lucy, Christos, George, Kiki, Mrinalini and a ton of Mozilla volunteers through the Reps program and Reps Council.

You can watch a short video of the project we called “Mission-Driven Mozillians” and how we applied all of this:

What’s next?

I hope this article has helped you understand how we have been modernizing our community approach at Mozilla, and I also hope it can inspire others in their work. I’ve personally been following this approach in all the projects I’ve been helping with Community Strategy, from Mission-Driven Mozillians to Mozilla Reps, Mozilla Support and Common Voice.

I truly believe that a strong community strategy is key for any organization where volunteers play a key role – not only to provide value to the organization or project, but also to bring that value back to the people who decided to donate their precious free time because they believe in what you are doing.

There is no way for your strategy to succeed in the long term if volunteers don’t feel and ARE part of the team, working together with you and your team and influencing the direction of the project.

Which part of my work are you most interested in, so I can cover it next in more detail?

Feel free to reach out to me via email (rmartin at mozilla dot com) or Twitter if you have questions or feedback; I really want to hear from others solving similar problems.

Thanks!

The Mozilla Blog: ICANN Directors: Take a Close Look at the Dot Org Sale


As outlined in two previous posts, we believe that the sale of the nonprofit Public Interest Registry (PIR) to Ethos Capital demands close and careful scrutiny. ICANN — the body that granted the dot org license to PIR and which must approve the sale — needs to engage in this kind of scrutiny.

When ICANN’s board meets in Los Angeles over the next few days, we urge directors to pay particular attention to the question of how the new PIR would steward and be accountable to the dot org ecosystem. We also encourage them to seriously consider the analysis and arguments being made by those who are proposing alternatives to the sale, including the members of the Cooperative Corporation of .ORG Registrants.

As we’ve said before, there are high stakes behind this sale: Public interest groups around the world rely on the dot org registry to ensure free expression protections and affordable digital real estate. Should this reliance fail under future ownership, a key part of the public interest internet infrastructure would be diminished — and so would the important offline work it fuels.

Late last year, we asked ISOC, PIR and Ethos to answer a series of questions about how the dot org ecosystem would be protected if the sale went through. They responded and we appreciate their engagement, but key questions remain unanswered.

In particular, the responses from Ethos and ISOC proposed a PIR stewardship council made up of representatives from the dot org community. However, no details about the structure, role or powers of this council have been shared publicly. Similarly, Ethos has promised to change PIR’s corporate structure to reinforce its public benefit orientation, but provided few details.

Ambiguous promises are not nearly enough given the stakes. A crystal-clear stewardship charter — and a chance to discuss and debate its contents — are needed before ICANN and the dot org community can even begin to consider whether the sale is a good idea.

One can imagine a charter that provides the council with broad scope, meaningful independence, and practical authority to ensure PIR continues to serve the public benefit. One that guarantees Ethos and PIR will keep their promises regarding price increases, and steer any additional revenue from higher prices back into the dot org ecosystem. One that enshrines quality service and strong rights safeguards for all dot orgs. And one that helps ensure these protections are durable, accounting for the possibility of a future resale.

At the ICANN board meeting tomorrow, directors should discuss and agree upon a set of criteria that would need to be satisfied before approving the sale. First and foremost, this list should include a stewardship charter of this nature, a B corp registration with a publicly posted charter, and a public process of feedback related to both. These things should be in place before ICANN considers approving the sale.

ICANN directors should also discuss whether alternatives to the current sale should be considered, including an open call for bidders. Internet stalwarts like Wikimedia, experts like Marietje Schaake and dozens of important non-profits have proposed other options, including the creation of a co-op of dot orgs. In a Washington Post op-ed, former ICANN chair Esther Dyson argues that such a co-op would “[keep] dot-org safe, secure and free of any motivation to profit off its users’ data or to upsell them pricy add-ons.”

Throughout this process, Mozilla will continue to ask tough questions, as we have on December 3 and December 19. And we’ll continue to push ICANN to hold the sale up against a high bar.

The post ICANN Directors: Take a Close Look at the Dot Org Sale appeared first on The Mozilla Blog.

Mozilla Addons Blog: Extensions in Firefox 72

After the holiday break, we are back with a slightly belated update on extensions in Firefox 72. Firefox releases are moving to a four-week cycle, so you may notice these posts getting a bit shorter. Nevertheless, I am excited about the changes that have made it into Firefox 72.

Welcome to the (network) party

Firefox determines whether a network request is considered third-party, and now exposes this information to webRequest listeners, as well as the proxy onRequest listener, via a new thirdParty property. Content blockers can use this information as an additional factor when deciding whether a request needs to be blocked.
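As a sketch of how a content blocker might use the new property, the decision logic below is pulled into a pure function. The thirdParty field and the webRequest.onBeforeRequest registration are real API surface; the blocklist format and the host-matching rule are invented for illustration.

```javascript
// Sketch: block only third-party requests whose host is on a blocklist.
// `details` mimics the object passed to webRequest listeners in Firefox 72+,
// which now carries a boolean `thirdParty` flag.
function shouldBlock(details, blocklist) {
  if (!details.thirdParty) {
    return false; // first-party requests are left alone in this example
  }
  const host = new URL(details.url).hostname;
  return blocklist.some((b) => host === b || host.endsWith("." + b));
}

// In an extension you might wire it up like this (not run here):
// browser.webRequest.onBeforeRequest.addListener(
//   (details) => ({ cancel: shouldBlock(details, ["tracker.example"]) }),
//   { urls: ["<all_urls>"] },
//   ["blocking"]
// );
```

The point is that thirdParty saves the extension from re-deriving the first-party origin itself, which is easy to get wrong for frames and redirects.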

Doubling down on security

On the road to Manifest v3, we also recently announced the possibility to test our new content security policy for content scripts. The linked blog post will fill you in on all the information you need to determine if this change will affect you.

More click metadata for browser- and pageActions

If your add-on has a browserAction or pageAction button, you can now provide additional ways for users to interact with it. We’ve added metadata to the onClicked listener, specifically the keyboard modifiers that were active and a way to differentiate between a left and a middle click. When making use of these features in your add-on, keep in mind that not all users are accustomed to using keyboard modifiers or different mouse buttons when clicking on icons. You may need to guide your users through the new feature, or consider it a power-user feature.
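Here is a small sketch of how a listener might interpret that metadata. The shape assumed below (a `button` integer where 0 is a left click and 1 is a middle click, plus a `modifiers` array of strings such as "Ctrl") follows the documented click-data object; the mapping of clicks to actions is just an example, not a recommendation.

```javascript
// Sketch: map browserAction/pageAction click metadata to an action.
// Middle click or Ctrl+click opens in a new tab in this example policy.
function clickAction(info) {
  const modifiers = (info && info.modifiers) || [];
  const button = info && typeof info.button === "number" ? info.button : 0;
  if (button === 1 || modifiers.includes("Ctrl")) {
    return "open-in-new-tab";
  }
  return "open-in-current-tab";
}

// In an extension (not run here):
// browser.browserAction.onClicked.addListener((tab, info) => {
//   if (clickAction(info) === "open-in-new-tab") { /* ... */ }
// });
```

Defaulting gracefully when the metadata is absent keeps the same handler working in browsers that don't pass a second argument.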

Changing storage.local using the developer tools

In Firefox 70 we reported that the storage inspector would be able to show keys from browser.storage.local. Initially the data was read-only; since Firefox 72 we also have limited write support. We hope this will allow you to better debug your add-ons.

Miscellaneous

  • The captivePortal API now provides access to the canonicalURL property. This URL is requested to detect the captive portal state and defaults to http://detectportal.firefox.com/success.txt.
  • The browserSettings API now supports the onChange listener, allowing you to react accordingly when browser settings change.
  • Extension files with the .mjs extension, commonly used with ES6 modules, will now load correctly. You may come across this when using script tags, for example.
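As an illustration of the onChange pattern mentioned above, the sketch below tracks a setting's value. The `get()`/`onChange.addListener()` surface mirrors a BrowserSetting object such as browserSettings.webNotificationsDisabled; the fake setting is a hypothetical stand-in so the pattern can run outside a browser.

```javascript
// Sketch: report a setting's current value, then every change to it.
// `setting` is assumed to expose get(details) -> Promise<{value}> and
// onChange.addListener(fn), like a WebExtensions BrowserSetting.
function watchSetting(setting, onUpdate) {
  setting.get({}).then((d) => onUpdate(d.value)); // initial value
  setting.onChange.addListener((d) => onUpdate(d.value)); // later changes
}

// Hypothetical stub standing in for a browserSettings setting:
function makeFakeSetting(initial) {
  const listeners = [];
  return {
    value: initial,
    get() { return Promise.resolve({ value: this.value }); },
    onChange: { addListener: (fn) => listeners.push(fn) },
    set(v) { this.value = v; listeners.forEach((fn) => fn({ value: v })); },
  };
}
```

The same reader-plus-listener shape works for any of the browserSettings properties, so one helper can keep an extension's cached view of several settings fresh.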

A shout out goes to contributors Mélanie Chauvel, Trishul Goel, Myeongjun Go, Graham McKnight and Tom Schuster for fixing bugs in this version of Firefox. Also we’ve received a patch from James Jahns from the MSU Capstone project. I would also like to thank the numerous staff members from different corners of Mozilla who have helped to make extensions in Firefox 72 a success. Kudos to all of you!

The post Extensions in Firefox 72 appeared first on Mozilla Add-ons Blog.