(Updated) Introducing the Content Report Feature in Commons: Empowering Users and Admins to Maintain Quality

KataKeri
Instructure

Update: The production release has been delayed and the date is TBD for now - we want to make sure that this feature goes out to production in a form that is useful for our users.
In the meantime, the feature can be tested in beta. Don't hesitate to try it out and provide feedback on it.

The Canvas Commons team is committed to creating a platform where high-quality, trustworthy resources thrive. In response to valuable feedback from our Community, we're proud to unveil the Content Report feature, a tool designed to empower users and admins alike to maintain the quality and integrity of the platform.


What Is the Content Report Feature?

The Content Report feature allows users to anonymously report resources they believe are:

  • Inappropriate
  • Copyright-infringing
  • Spam

When a report is made, it gets sent directly to the Stats page, located under the Admin menu. Admins can see the number of reports and the types of issues flagged, making it easier to manage resources appropriately.

How It Works

For users:

  • If you encounter a resource that doesn't meet Commons standards, click the Report button and select the reason: Inappropriate, Copyright Infringement, or Spam.
  • Your report is sent anonymously to admins for review.

For admins:

  • Navigate to the Stats page under the Admin menu to view the ‘Reported’ column for each resource you have access to.
  • Click the number to see both the volume and type of reports for each resource, allowing you to decide the best course of action: editing, removing, or leaving the resource as is.
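Conceptually, the 'Reported' column on the Stats page is a per-resource tally of anonymous reports, broken down by reason. The following Python sketch illustrates that aggregation; the `Report` structure, field names, and `summarize_reports` function are hypothetical assumptions for illustration, not Commons' actual implementation:

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical report record - field and reason names are illustrative,
# not the actual Commons data model.
@dataclass
class Report:
    resource_id: str
    reason: str  # "Inappropriate", "Copyright Infringement", or "Spam"

def summarize_reports(reports):
    """Tally anonymous reports per resource, grouped by reason."""
    summary = {}
    for report in reports:
        summary.setdefault(report.resource_id, Counter())[report.reason] += 1
    return summary

reports = [
    Report("res-1", "Spam"),
    Report("res-1", "Spam"),
    Report("res-1", "Inappropriate"),
    Report("res-2", "Copyright Infringement"),
]
summary = summarize_reports(reports)
assert summary["res-1"]["Spam"] == 2
# The total per resource is what an admin would see in the 'Reported' column:
assert sum(summary["res-1"].values()) == 3
```

Because reports are anonymous, nothing about the reporter is carried in this view - only the counts and reason categories reach the admin.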


Why This Matters

The ability to report content not only strengthens the integrity of our resources but also deepens the collaboration between users and admins. With this streamlined reporting system, Commons becomes an even more secure and trusted platform for all.

The Content Report feature is live in beta; the production release date is TBD.

Your voice is essential - let us know what you think about this feature and how we can further ensure quality in Commons!




19 Comments
chriscas
Community Coach

Hi @KataKeri,

Thanks for letting us know about this new feature!  I like the general idea here of having a way to report problematic content.

As a Canvas admin myself, though, I can tell you that I rarely visit Commons in my regular work routine.  I wonder if there could be a way for admins to get email notifications when content from our institution is reported.  I know I could try to modify my weekly/monthly routine to visit Commons more, but the reality is that many of us have so many things we're responsible for that this one is probably going to get missed pretty often without some kind of notification.  Do you have any thoughts or comments on this idea?

-Chris

JamesSekcienski
Community Coach

@KataKeri 

Thank you for sharing an update on this new feature coming soon!  This looks like a nice benefit to help maintain appropriate content in Commons.  I agree with @chriscas that it would be helpful to have a notification that admins could be subscribed to about reported content.  That is one of the issues with reported profile photos too since it can be easy to forget to go and check if there are new reports to review.

Is there a reason that the user making the report would remain anonymous?  If we review the resource and don't notice an issue, how could we follow up with the user to get clarification?  For example, perhaps we don't know what the copyright infringement applies to, or perhaps we are unclear on what they consider to be inappropriate.

In addition, is there anything to prevent a user from making a surge of false reports?

TrinaAltman
Community Participant

@KataKeri 

Just wanted to voice agreement with the suggestions and questions from @chriscas and @JamesSekcienski. We had similar feedback about the discussion reporting feature being anonymous - for Commons, how can Admins follow up if they don't understand why a resource was reported? Also, can there be a text box for the reporter to explain why they are making the report? Otherwise, the person evaluating it only knows a broad tag for an entire resource, which may have a lot of content, but not specifically where the suspected violation occurs within that resource.

For profile picture reporting, we ended up using a theme customization to change the reporting functionality to a web link that points to a page with guidance for users on considerations when reporting something and how to open a report with our Office of Compliance, Ethics, and Equal Opportunity since they are the office on campus who would need to adjudicate such complaints, not Canvas Administrators. (We worked with our CEEO office on this change and the specific wording for the page before making the change.)

I feel the same way about Commons Reporting. 1) Canvas Admins should not have to manually go check an area to see if anything's been reported. 2) We are not the institutional experts on whether something is Inappropriate or a Copyright Infringement. Is there a way for us to disable or change this functionality for Commons resources that are tagged under our instance (like we did for Profile picture reporting) so it doesn't put us in a position of either not knowing something has been reported, not knowing what to do if it is, or having to be a go-between for the report and the office that would make a determination?

KataKeri
Instructure
Author

Hey @TrinaAltman @JamesSekcienski @chriscas,

Thank you so much for the feedback!
A little bit of background information: this was a Hackweek project within Instructure, and this is only phase I. The points you've mentioned are totally valid and can (and should) become further improvements.

Notifications, providing a reason for reporting, and enabling/disabling the feature are all great ideas, and I'll take these back to our team to take a look at.

The reason for anonymity at the moment is that I'm unsure how we can make this work without privacy issues. I'm taking this to our legal team to figure out, as I see the use case for knowing who made the report and contacting the reporter.

At the moment there is no prevention against false reports, but it seems like the further improvements (attached name, giving a reason) could serve as one.

This phase is only for giving a heads-up to the admin about possible problems with shared resources. Taking action is entirely up to the admin. They can also ignore the reports if they wish.

 

TrinaAltman
Community Participant

@KataKeri 

Thank you for the response and info and for taking the feedback back to the team. I am glad to hear this is only Phase 1 and further improvements should be forthcoming. As it stands now, I personally don't feel like the feature is ready for us to use and I really don't like the idea of just ignoring reports. But if reports were to come in as it is designed now, we'd be fairly limited in how we could understand and deal with them.

Instructure provided schools the option to disable discussion reporting when it became clear the feature was (similarly) immature. I would love to see the same option for the Commons reporting so that we can decide when it is mature enough to be useful for our institution.

Regarding the privacy issues for identifying the reporter: If the legal team determines it would be an issue to automatically identify the reporter (with notice, I'd assume), maybe it could be optional for the reporter to identify themselves. I imagine many who were to report something would actually appreciate a conversation about it with the ability to provide clarification so that their report can be evaluated more thoroughly and carefully.

KataKeri
Instructure
Author

@TrinaAltman My plan is to delay the prod release date until we add an option to enable/disable this feature. I'll discuss this with the development team, and will provide an update on this blogpost and on the release notes too. Again, thank you so much for your feedback - this is exactly why I wanted to run it by our users. 

TrinaAltman
Community Participant

That's great news @KataKeri - thanks for your responsiveness on this! I would also request it is disabled by default so schools can decide if/when they want to use it, instead of enabled by default which would put it in the interface for some period of time until schools could turn it off. It was a bit awkward for us with the Discussions Reporting since it was already visible to users for a period of time before the option became available to disable it.

KataKeri
Instructure
Author

@TrinaAltman No problem. We release features like this with the default off most of the time - that would also make sense here.

TrinaAltman
Community Participant

Hi @KataKeri, I see the date in the Commons Release Notes for the reporting feature has been updated to 10/19/24, but I don't see a Feature Option for the reporting functionality listed alongside the other new Feature Options coming in the 10/19/24 Canvas Release, nor an account setting. The Commons Release Notes for 10/19/24 also just describe the new reporting feature but don't mention that it will be disabled by default. Is a configuration element for the Commons reporting feature still being added (e.g., via a Canvas Feature Option, Canvas Account setting, or Commons Account setting) prior to the release of the reporting functionality? Thanks.

KataKeri
Instructure
Author

@TrinaAltman I've updated the prod release date so no one is alarmed that this is being released in 2 days in its current form.
I picked our next possible release date option. We are still working on a solution for how we can make the account setting happen; it may take a while - maybe the prod release will be delayed again, or alternatively this feature will stay in beta until we can deliver all of the improvements.
Since the solution is not trivial, I can't specify it yet in the release notes. As soon as I have something, I will update the release notes.

I'll discuss with the documentation team whether we can leave the prod release date as N/A and push updates later.

KataKeri
Instructure
Author

@TrinaAltman @JamesSekcienski @chriscas Would you be up for a feedback interview about this feature, perhaps? And about Commons in general.

chriscas
Community Coach

Hi @KataKeri, I'd definitely be up for a feedback interview!

TrinaAltman
Community Participant

@KataKeri - sure, I'd be happy to participate. 🙂

JamesSekcienski
Community Coach

@KataKeri Yes, I would be happy to participate 😀

KataKeri
Instructure
Author

@JamesSekcienski @TrinaAltman @chriscas Much appreciated. Would you please get in touch with your CSMs and let them know to schedule a call with me? (Kata Keri, ConLab/Commons PM). 

SamP
Community Explorer

Hi @KataKeri, thank you for these updates! When we posted about this in February, one of the big issues was that even when we reported the content through Instructure support, they told us they couldn't take down the content either - only the institution that originally posted it could - and Instructure support could not get a hold of that institution. This led to the inappropriate content staying in Canvas Commons.

With this solution, would reported content also get flagged for review by a content moderator at Instructure who could potentially take it down? In our case it was a link embedded within a finance course that went to a domain name previously used by an educational website (at the time it was shared to Commons). That domain name had since been purchased and taken over by a pornographic website that posted explicit content, which obviously had nothing to do with finance. This was discovered by an instructor at our organization who had downloaded a course from Canvas Commons uploaded by another organization. It sounds like even with this reporting feature in place, we would have run into the same problem of the content staying. Let me know if I can clarify further, thank you!

jsowalsk
Community Coach

@KataKeri I am trying to understand why someone would want to report their own content. I thought it would be someone wanting to report in a consortium, group, or institution.

KataKeri
Instructure
Author

@SamP @jsowalsk 
The Hackweek project was based on this idea, yes. First, we implemented it in a way that the report goes to the resource's owner, who can edit/delete the problematic content - hoping that it was an honest mistake.

Alternatively, we thought about an option where these reports go to a Canvas siteadmin within our company, who can review and remove the reported content.

This is where I'm stuck at the moment and would like to do further discovery, because I have these questions:
- This would increase the workload for somebody, and I want to make sure this works out for them (as they would have to review and remove the reported content on a regular basis). This would require someone to act as a content moderator in Commons.
- Who is qualified to do this job and decide whether the reported content is indeed copyright-infringing? Harmful/spam content is mostly easy to decide on (or we could remove the copyright report type altogether).
- Thinking about it (and based on your feedback), a note is needed with the report to explain the reason, which can help decide the above. I have to check in with legal on whether it should be anonymous or whether we can link a username/email to the report (for privacy reasons).
- Also, an email notification about a report would be necessary, as I learned from our Commons Stats survey that the majority of our users don't even know that there's a Stats page within Commons under Admin.

A little bit of background on spam removal:
The previous PM, and then I, went to Commons on a biweekly basis to hunt for spam material, which was then collected and deleted by an engineer. Or, when it was reported, a support ticket was created to remove the problematic content. This is what we want to automate, to save time and engineering/PM effort and also to make Commons a more reliable site with quality content.

Hence the first version, where you can report problematic content to the admin.
I know it is not perfect and needs improvement; that's why we're collecting feedback - to meet your needs - while focusing on our outcomes too.

@jsowalsk - I know we discussed this in email; just for full visibility, I'll leave the answer here too. It's not likely someone will report their own content; I suggested doing this in beta because it's the easiest way to try this feature out and see the reports within your Stats.

We'll carry on improving this feature based on your feedback, but it may take a while. As soon as I have a testable prototype, I'll reach out here to see if you are up for testing. I really want to make this work, and I really appreciate your feedback on this.