Session Overview

In this episode of Tent Talks, Christine McGlade, a sessional lecturer in Digital Futures at OCAD University, shares her insights on designing tomorrow with a focus on ethics and AI. Christine discusses futures thinking as a design discipline akin to systems thinking, emphasizing the need for ongoing engagement with the world in order to anticipate change. She highlights the challenge of finding trusted primary sources in an era where AI-generated content is increasingly prevalent, a trend that could lead to "model collapse." Christine also delves into the ethical dilemmas designers face when creating AI-driven solutions and the importance of building ethical considerations into the design process. Finally, she shares her thoughts on the intersection of humor and AI, suggesting that although AI struggles to create humor, humor itself can be a powerful tool for surfacing ethical issues in AI.

Approaching Futures Thinking in AI:

- Futures thinking is likened to a design discipline, stressing the importance of scanning for signals of change.
- Challenges in finding trusted primary sources due to the proliferation of AI-generated content.
- The importance of using tools like Perplexity.ai and Google Scholar to access primary sources.

Model Collapse and AI:

- Model collapse results from an increase in AI-generated training data, leading to a decrease in the quality of AI outputs (a toy simulation of this feedback loop follows this list).
- Concerns about data pollution and the echoing of mediocrity in AI-generated content.
- The introduction of countermeasures like Nightshade, developed to protect artists' work from being used as AI training data.
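
To give a feel for the mechanism, here is a minimal, hypothetical simulation sketch (not from the episode): a "model" that is just a Gaussian is repeatedly re-fit to samples drawn from the previous generation's model, and the diversity of its output tends to shrink over generations.

```python
import numpy as np

# Toy, illustrative simulation (not from the episode): each generation's
# "model" is just a Gaussian, re-estimated from samples produced by the
# previous generation's model. Training repeatedly on synthetic output
# tends to shrink the spread of the data, which is the intuition behind
# model collapse and the "echoing of mediocrity" described above.

rng = np.random.default_rng(seed=0)

mu, sigma = 0.0, 1.0          # generation 0: the "real" data distribution
samples_per_generation = 100  # small samples make the drift visible sooner

for generation in range(1, 201):
    # Fit the next model purely to the previous model's synthetic output.
    synthetic = rng.normal(loc=mu, scale=sigma, size=samples_per_generation)
    mu, sigma = synthetic.mean(), synthetic.std()
    if generation % 25 == 0:
        print(f"generation {generation:3d}: spread (std) = {sigma:.3f}")
```

Watching the printed spread drift downward is the point: the later models see only the narrowed output of earlier models, so variety in the data is gradually lost.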

Ethical Considerations in AI-Driven Design:

- The need for designers to focus on the process rather than the outcomes when using AI to generate designs.
- Encouraging students to demonstrate their problem-solving process, emphasizing that the journey is as important as the destination.
- The limitations of AI in fully capturing the creative and design process, particularly in art and design.

Humor as a Tool in Addressing AI Ethics:

- AI's inability to create humor effectively, especially around sensitive or nuanced topics.
- The potential for humor to address and highlight ethical issues in AI, despite AI's limitations in understanding or generating humor.

Notable Quotes:

- "Futures thinking... is helping students to foster... a kind of ongoing engagement with the world."
- "It's actually pretty difficult to find trusted primary sources."
- "We're not getting innovation, right? And that's the bottom line."
- "The outcome is not the thing. The road that you travel to get there, that's the thing."

Reference Materials:

- Jeremy Rifkin's "The Empathic Civilization": Widely available from major book retailers in print, ebook, and audiobook formats.
- TechTarget article on model collapse: Model collapse refers to a situation where a machine learning model fails to generalize from its training data, often because it was trained on synthetic or unrepresentative data, producing increasingly homogenous or inaccurate outputs. It underscores the importance of diverse, representative training data for reliable real-world performance.
- Nielsen Norman Group publications on working with AI as designers: The Nielsen Norman Group is renowned for its research and publications on user experience (UX) design.
- Nightshade: A tool designed to protect artists' copyrights by transforming images into "poison" samples that disrupt AI model training. It aims to deter the use of unlicensed data by introducing unpredictable behaviors in models trained on such data, making licensing a more appealing option. Nightshade and Glaze play complementary roles: Glaze protects individual artworks from style mimicry, while Nightshade offers a collective defense against unauthorized scraping; both support artists and encourage responsible data use in AI development. (A simplified data-poisoning sketch follows this list.)
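
To make the "poison sample" idea concrete, here is a drastically simplified, hypothetical sketch of data poisoning in general; it is not Nightshade's actual method, which subtly perturbs images to mislead text-to-image models. In this toy setup, a small fraction of one class's training points is shifted into the other class's feature region while keeping the original labels, and a simple classifier trained on the contaminated set typically loses accuracy on clean data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simplified illustration of data poisoning in general, NOT Nightshade's
# actual technique. A fraction of class-0 training points is moved into
# class-1 feature territory while keeping label 0, and the classifier
# trained on the contaminated data typically performs worse on clean data.

rng = np.random.default_rng(seed=1)

def make_data(n_per_class):
    # Two well-separated 2D clusters, labeled 0 and 1.
    x0 = rng.normal(loc=[-2.0, 0.0], scale=1.0, size=(n_per_class, 2))
    x1 = rng.normal(loc=[+2.0, 0.0], scale=1.0, size=(n_per_class, 2))
    X = np.vstack([x0, x1])
    y = np.array([0] * n_per_class + [1] * n_per_class)
    return X, y

X_train, y_train = make_data(500)
X_test, y_test = make_data(500)

# "Poison" 15% of class-0 training points: shift their features deep into
# class-1 territory while their labels stay 0.
poison_idx = rng.choice(np.where(y_train == 0)[0], size=75, replace=False)
X_poisoned = X_train.copy()
X_poisoned[poison_idx] += np.array([5.0, 0.0])

clean_model = LogisticRegression().fit(X_train, y_train)
poisoned_model = LogisticRegression().fit(X_poisoned, y_train)

print("accuracy trained on clean data:   ", clean_model.score(X_test, y_test))
print("accuracy trained on poisoned data:", poisoned_model.score(X_test, y_test))
```

The gap between the two accuracies is the cost of training on contaminated data; tools like Nightshade aim to impose an analogous cost on models trained on unlicensed images, at far greater scale and subtlety than this toy example.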

About Tent Talks

Chicago Camps hosts irregularly scheduled Tent Talks with people from all across the User Experience Design community, and beyond. Who really likes limits, anyway? If it's a cool idea, we'd love to hear about it and share it!

What is a Tent Talk? That's a great question, we'd love to tell you.

Tent Talks are short-form in nature, generally lasting 10-20 minutes (ish) in a recorded format--we like to think of them as "S'mores-sized content" because that's pretty on-brand. Tent Talks can be a presentation on a topic, a live Q&A session about the work we do, or the work around the work we do, or really just about anything--we don't want to limit ourselves, or you.

You should send along an idea or topic of your own so we can learn from you, as well! You don't have to be a published author or a professional speaker on a circuit to be good at your job, so please, put yourself forward, and let's have some fun, talk, and share your experience with others!