I understand the intent of this policy, and I appreciate the commitment to fostering a culture of meaningful and original contributions! But I do wonder whether it may be helpful to carve out an exception for AI content that is posted specifically to illustrate a problem with the AI output, as I'd intended to do here. I'm concerned that if people only ever see the "corrected" versions (and not how many changes were needed to turn the AI output into valid TW syntax), they'll come away with some misapprehensions about the general accuracy and reliability of AI-generated code.
Would it be reasonable to share such cautionary examples as screenshots instead, to minimize the likelihood that they'll be scraped by bots or that someone will try to incorporate them into their own wiki?