Name of the service provider: G2.com, Inc.
Date of the publication of the report: February 27, 2026
Date of the publication of the latest previous report: N/A
Starting date of reporting period: January 1, 2025
Ending date of reporting period: December 31, 2025
G2.com, Inc., a Delaware corporation doing business in the European Union (“G2”), is the world’s largest and most trusted software marketplace made for business-to-business use. We are dedicated to maintaining the safety, professionalism, and transparency of our platform in order to preserve the trust of our users. In line with this commitment, G2 welcomes the European Union's Digital Services Act (“DSA”) and its objectives of fostering a safer online environment.
This report is published by G2. As a provider of an online platform, G2 is subject to the DSA’s transparency reporting obligations. This Transparency Report is published in response to the obligations under DSA Articles 15 and 24 and reports information pertaining to the period from January 1, 2025 to December 31, 2025.
For more information regarding G2’s compliance with the DSA, please visit https://legal.g2.com/digital-services-act
This Transparency Report contains information regarding the following topics as they pertain to the DSA:
G2 provides the information below in response to DSA Article 15(1)(a).
This section reports data regarding orders received from Member States’ authorities including orders issued in accordance with Articles 9 and 10.
G2 has not received any orders from Member States’ authorities including orders issued in accordance with Articles 9 and 10.
G2 provides the information below in response to DSA Article 15(1)(b).
This section reports data regarding notices submitted in accordance with the notice and action mechanism in Article 16.
G2 has not received any notices under Article 16.
G2 provides the information below in response to DSA Article 15(1)(c).
This section reports data regarding content moderation engaged in by G2 on its own initiative with respect to illegal content, absent a user report.
G2 has no data to report regarding illegal content moderation.
G2 provides the information below in response to DSA Article 15(1)(c).
This section reports data regarding content moderation engaged in by G2 on its own initiative with respect to content incompatible with G2’s Terms and Conditions, absent a user report.
There were 23,973 removal measures, as follows:
| Scams and Frauds | Own Initiative | Automated Means | Total Per Subcategory |
| --- | --- | --- | --- |
| Inauthentic Accounts | 5,756 | 574 | 6,330 |
| Inauthentic Reviews | 5,644 | 94 | 5,738 |
| Other¹ | 9,701 | 2,204 | 11,905 |
| Total | 21,101 | 2,872 | 23,973 |

¹ Includes user-centric reasons such as low-quality, offensive or unprofessional user avatars, difficult-to-understand reviews, and reviews irrelevant to the product at hand.
G2 provides the information below in response to DSA Article 15(1)(d).
This section reports data regarding complaints received through the internal complaint-handling system.
G2 received a total of 3,882 complaints submitted to the internal-complaints mechanism, with the following outcomes:
| Decision | Total |
| --- | --- |
| Decision Upheld | 2,343 |
| Decision Reversed | 1,539 |
Out-of-court settlement body disputes
G2 provides the information below in response to DSA Article 24(1)(a).
This section reports on the number of disputes submitted to the out-of-court dispute settlement bodies referred to in DSA Article 21.
G2 did not receive notice of any disputes submitted to out-of-court dispute settlement bodies.
Suspension
G2 provides the information below in response to DSA Article 24(1)(b).
This section reports data on the number of suspensions imposed pursuant to DSA Article 23.
G2 did not suspend any users under DSA Article 23.
G2 provides the information below in response to DSA Article 15(1)(e).
On its own initiative, G2 took measures solely using automated means a total of 4,089 times. The performance of the automated means over the reporting period was as follows:
| Metric | Percentage |
| --- | --- |
| Accuracy | 80.2% |
| Precision | 91.3% |
| Recall | 21.0% |
Summary of the content moderation engaged in at the provider’s own initiative
Our commitment to trust and transparency is reflected in our long-standing values and the guidelines that govern our platform. Our Community Guidelines ensure that all reviews are accurate, reliable, and unbiased, fostering a fair and trustworthy environment for both software buyers and vendors. To maintain the integrity of our platform, we take action on content that violates applicable laws, our Community Guidelines or Terms of Use. Our moderation process leverages a combination of automated models and expert human review to uphold the highest standards of content integrity.
Reviews power the core of G2, and the Reviews Moderation/Content QA team is responsible for ensuring that only real and authentic reviews appear on G2.com. The purpose of the moderation policy is to:
Meaningful and comprehensible information regarding content moderation engaged in at the provider's own initiative
Our core automated system efficiently analyzes a wide range of user-generated content, including software reviews and vendor responses. This system's moderation decisions are subject to strict thresholds, and if these are not met, the system defers to human review rather than taking automated action. Approved content remains accessible on G2, while any content that violates our Community Guidelines will not be published. Users have the option to appeal moderation decisions or resubmit content for reconsideration.
Qualitative description of the automated means
Our automated system is based in part on past decisions human reviewers have made regarding whether content violates G2's community guidelines. G2 continuously monitors the overall performance and accuracy of this system, setting minimum thresholds to ensure reliability. The system's moderation decisions are subject to strict thresholds, and if these are not met, the system defers to human review rather than taking automated action. G2 periodically retrains its automated system to adapt to evolving content trends and updates in human moderation decisions, ensuring ongoing accuracy and fairness. Users have the ability to appeal moderation decisions if they believe an error has been made.
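The threshold-and-deferral mechanism described above can be sketched as follows. This is an illustrative example only: the function name, threshold values, and action labels are hypothetical and do not reflect G2's production system.

```python
# Hypothetical sketch of confidence-threshold gating for automated
# moderation decisions, where low-confidence cases defer to human review.

APPROVE_THRESHOLD = 0.95  # hypothetical minimum confidence to auto-approve
REMOVE_THRESHOLD = 0.95   # hypothetical minimum confidence to auto-remove

def route_decision(violation_score: float) -> str:
    """Route content given the model's estimated probability (0..1)
    that it violates the community guidelines."""
    if violation_score >= REMOVE_THRESHOLD:
        return "auto_remove"       # confident violation: act automatically
    if 1.0 - violation_score >= APPROVE_THRESHOLD:
        return "auto_approve"      # confident non-violation: publish
    return "human_review"          # threshold not met: defer to a moderator

print(route_decision(0.99))  # auto_remove
print(route_decision(0.01))  # auto_approve
print(route_decision(0.60))  # human_review
```

The key design point is that the automated system only acts when its confidence clears a strict threshold in either direction; everything in between is escalated to a human reviewer rather than decided automatically.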
Qualitative description of indicators of accuracy and possible rate of error of automated means
G2 monitors the accuracy of its automated moderation system by comparing its decisions to human review decisions. The system's performance is measured using accuracy indicators such as precision, recall, and false positive/false negative rates. Audits are conducted where a sample of automated decisions are reviewed by human moderators to measure accuracy. Additionally, user appeals provide a feedback mechanism that helps refine the system’s accuracy over time.
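The accuracy indicators named above are standard confusion-matrix metrics, computed by cross-checking automated decisions against human audit decisions. The sketch below shows the standard formulas; the function and the sample counts are illustrative, not G2's reported figures.

```python
# Standard confusion-matrix metrics for an automated moderation system,
# measured against human review as ground truth. Counts are made up.

def moderation_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """tp/fp: automated removals humans confirmed/overturned;
    tn/fn: automated approvals humans confirmed/overturned."""
    total = tp + fp + tn + fn
    return {
        "accuracy": (tp + tn) / total,          # decisions humans agree with
        "precision": tp / (tp + fp),            # removals that were true violations
        "recall": tp / (tp + fn),               # violations the system caught
        "false_positive_rate": fp / (fp + tn),  # clean content wrongly flagged
        "false_negative_rate": fn / (fn + tp),  # violations the system missed
    }

m = moderation_metrics(tp=80, fp=20, tn=880, fn=20)
print(f"precision={m['precision']:.1%} recall={m['recall']:.1%}")
# precision=80.0% recall=80.0%
```

Note the trade-off these metrics capture: a system tuned for high precision (few wrongful removals) will typically show lower recall, since borderline cases are deferred to human review instead of being removed automatically.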
Specification of the precise purposes to apply automated means
G2 uses automated moderation primarily to enhance efficiency and consistency in review moderation. Generally, automated means focus on fraud detection, spam detection, and guideline enforcement.
Safeguards applied to the use of automated means
At G2, the following primary safeguards are applied to the use of automated means for content moderation: