Another Open Source Group Criticizes GitHub Co-Pilot, Advocates Leaving GitHub — Visual Studio Magazine



Another open source advocate decried GitHub Copilot, even calling on organizations to quit the open source code repository platform that spawned the AI-powered tool.

GitHub Copilot, an “AI pair programmer” coding assistant, has shaken up the world of software development with advanced code completion capabilities that can produce entire programs from natural language instructions.

GitHub Copilot (source: GitHub).

Using state-of-the-art AI technology (OpenAI Codex), GitHub Copilot recently achieved general availability status, offered at $10/month.

Turn words into code (source: OpenAI).

It has wowed developers with its ability to anticipate a coder’s intent and even create entire projects – such as a simple game – from typed prompts alone. The product also provides context-aware code suggestions, complete methods, boilerplate code, whole unit tests, and even complex algorithms.

GitHub Copilot in animated action (source: GitHub).

Yet it has also drawn backlash stemming from legal, ethical, security and other concerns, particularly from the Free Software Foundation (FSF), which last year deemed GitHub Copilot “unacceptable and unfair”.

Now another open source-focused organization, Software Freedom Conservancy (SFC), has joined the criticism. Like the FSF, the group is a strong advocate of strict free and open source software (FOSS) principles.

In a June 30 blog post, SFC listed numerous complaints about GitHub’s behavior, particularly the release of a paid service based on GitHub Copilot, whose AI model was trained on source code hosted on GitHub.

“Launching a for-profit product that disrespects the FOSS community like Copilot does simply makes the weight of GitHub’s bad behavior too much to bear,” SFC said in the post.

Additionally: “We are ending all of our own use of GitHub and announcing a long-term plan to help FOSS projects migrate off of GitHub.”

To accompany its long list of grievances, the group has set up a dedicated site laying out these and other justifications for leaving the platform, aptly titled “Give Up GitHub!”

GitHub Copilot complaints headline that list:

Copilot is a for-profit product – developed and marketed by Microsoft and its subsidiary GitHub – that uses artificial intelligence (AI) techniques to interactively and automatically generate code for developers. The AI model was trained (according to GitHub’s own statements) exclusively with projects hosted on GitHub, many of them licensed under copyleft licenses. Most of these projects are not in the “public domain”; they are under FOSS licenses. These licenses have terms including proper attribution of authorship and, in the case of copyleft licenses, sometimes require that works based on and/or incorporating the software be licensed under the same copyleft license as the prior work. Microsoft and GitHub have been ignoring these licensing requirements for over a year. Their only defense of these actions was a tweet from their former CEO, in which he falsely claims that the unsettled law on this subject is in fact settled. Beyond the legal issues, the ethical implications of GitHub’s choice to use copylefted code in the service of building proprietary software are serious.

SFC said it will not require its existing member projects to leave the platform at this time, but it will not accept new member projects that lack a long-term plan to migrate off GitHub. It promised to provide resources to support any member projects that choose to migrate and to help out however it can.

The lengthy June 30 post lists three questions related to GitHub Copilot that SFC says it posed to Microsoft (which owns GitHub), which went unanswered for a year and which Microsoft has since formally refused to answer:

  1. What case law, if any, did you rely on in Microsoft and GitHub’s public assertion, stated by the (then) CEO of GitHub, that: “(1) training ML systems on public data is fair use, (2) the output belongs to the operator, just like with a compiler”? In the interest of transparency and respect for the FOSS community, please also provide the community with your full legal analysis explaining why you believe these statements to be true.
  2. If, as you claim, you are allowed to train the model (and allow users to generate code based on that model) on any code without being bound by license terms, why did you choose to train the Copilot model only on FOSS? For example, why aren’t your Microsoft Windows and Office codebases in your training set?
  3. Can you provide a list of licenses, including copyright holder names and/or Git repository names, that were in the training package used for Copilot? If not, why are you hiding this information from the community?

The group’s standard description reads: “Software Freedom Conservancy is a nonprofit organization focused on ethical technology. Our mission is to ensure the right to repair, improve, and reinstall software. We promote and defend these rights by supporting free and open source software (FOSS) projects, leading initiatives that actively make technology more inclusive, and advancing policy strategies that defend FOSS (such as copyleft).”

In yesterday’s announcement, SFC acknowledged that “we expect this particular blog post to generate a lot of discussion.”

About the Author

David Ramel is an editor and writer for Converge360.
