Author(s): Rajesh Poonia¹, Jyoti Godara²
Abstract: The emergence of algorithmic systems in digital marketing has pushed automation to the forefront of consumer engagement programs. While modern AI-based marketing technologies are intended to enhance personalization and enrich the user experience, recent research points to the rise of so-called dark patterns: interface designs that covertly steer consumer choices toward corporate goals. These include coercive consent flows, obscured opt-out channels, and deceptive calls to action, all of which exploit psychological and emotional predispositions (Brignull, 2010). This paper develops a theoretical model for distinguishing ethical from manipulative automated strategies in marketing environments. Drawing on prior literature on trust, transparency, and accountability in algorithms and machine learning (Pasquale, 2015; Raji et al., 2020), the framework proposes the concept of Trust-Based Marketing Automation (TBMA), which seeks to balance efficiency-driven optimization with a consumer-centric prioritization of fairness. It is argued that dark patterns may yield short-term gains in conversions but carry long-term losses in customer trust and brand equity (Luguri & Strahilevitz, 2021). Through an interdisciplinary analysis, the study presents four interconnected constructs, perceived fairness, algorithmic intent, user autonomy, and interface transparency, that define the ethical boundaries of marketing automation. The resulting conceptual model contributes to the ongoing scholarly discourse on responsible AI in consumption contexts and offers business managers recommendations for designing trustworthy digital experiences.
Keywords: Ethical AI, dark patterns, marketing automation, consumer trust, algorithmic transparency, persuasive design, digital ethics.
DOI: 10.61161/ijarcsms.v13i6.4
Pages : 23-37