More than anything, this hackathon proved a point: marketers can do this. It set a precedent, sparked a mindset shift, opened eyes, empowered the team, and created space to play, test, and push the boundaries of what’s possible.
The twist? None of us are developers—and that was exactly the point.
In 2025, we experimented heavily with custom GPTs and prompting. This hackathon was the next step: testing whether a marketing team can design systems, not just prompts—creating AI-driven workflows and agents to support market research, verification, approval routing, and safe execution.
Instead of asking, “How can AI help me do this faster?” we asked a bigger question:
What if AI could run the workflow, while people focus on judgment, creativity, and direction?
Working as ONE team, marketers set out to Aim High and build something that, not long ago, felt out of reach: agentic AI workflows that connect tools end to end—from discovery and decision gates to approval and execution.
Along the way, we proved something critical: within AI automation, human verification is essential. Approval checkpoints, audit logs, and stop conditions were built in by default, ensuring AI does the heavy lifting while humans remain accountable.
The result is a strong signal of how we approach AI at Ataccama: ambitious yet responsible, experimental but grounded, and always human-centric. When you Aim High and operate as ONE team, the future arrives faster than you expect.
Q: Why a marketing-only hackathon?
David: Another intentional choice was keeping the hackathon marketing-only. While cross-functional collaboration is powerful, this format serves a different purpose.
The goal was to build confidence and a sense of ownership: to prove that marketers don’t need to wait for engineering to innovate with AI.
Marketing already builds complex systems—campaigns, funnels, journeys, approval processes. Translating that systems thinking into AI workflows felt like a natural next step. Removing developers from the room wasn’t a constraint; it was an invitation to the team.
And that invitation worked. Early hesitation quickly turned into curiosity. Curiosity turned into experimentation. And then came the quiet but powerful realization across teams: Wait a second. I can actually do this.
Q: What did “agentic” mean in real terms?
David: When teams were challenged to build “agentic” systems, the word wasn’t left abstract. In practice, it meant designing workflows that could move through steps independently—gathering inputs, validating them, making decisions, routing work, and knowing when to stop.
Agentic didn’t mean autonomous in the wild. It meant intentional orchestration.
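To make that concrete, here is a minimal sketch of the pattern the teams aimed for: a loop that gathers inputs, validates them, decides what to do next, and knows when to stop. Every name in it is illustrative; it is not any team’s actual build.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    query: str
    findings: list = field(default_factory=list)
    status: str = "open"

MAX_STEPS = 10  # stop condition: the loop is never unbounded

def gather_inputs(task: Task) -> list[dict]:
    # Placeholder for a real discovery step (web search, sheet read, etc.)
    return [{"source": "web", "content": f"result for {task.query}"}]

def validate(finding: dict) -> bool:
    # Placeholder for schema checks, confidence thresholds, deduplication
    return bool(finding.get("content"))

def decide(task: Task) -> str:
    # Decision gate: either stop cleanly or route to a human for approval
    return "stop" if not task.findings else "route_for_approval"

def run(task: Task) -> Task:
    for _ in range(MAX_STEPS):
        task.findings = [f for f in gather_inputs(task) if validate(f)]
        action = decide(task)
        if action == "stop":
            task.status = "stopped"          # nothing actionable: don't guess
            return task
        if action == "route_for_approval":
            task.status = "awaiting_human_approval"
            return task                      # handoff: the agent never executes
    task.status = "stopped"                  # hard stop if the loop runs long
    return task

print(run(Task(query="outdated logo usage")).status)
```

The point of the structure is that the loop has exactly two exits: a clean stop, or a handoff to a person.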
Just as important as ambition were the guardrails. Brand consistency, customer trust, and accountability were built into the challenge from the start. Every solution had to reflect how Ataccama shows up in the market.
That’s also why “human in the loop” was non-negotiable. Approval checkpoints, audit logs, verification steps, and stop conditions weren’t optional extras—they were core design principles.
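In code terms, a checkpoint like this can be as simple as an explicit gate that records every proposal and decision in an append-only log and never defaults to approval. This is a hypothetical sketch, not the hackathon’s implementation; the file name and stage labels are assumptions.

```python
import json
import time

AUDIT_LOG = "audit_log.jsonl"  # assumed name for an append-only decision record

def log_event(event: dict) -> None:
    # Every proposal, decision, and execution leaves a timestamped trace
    event["ts"] = time.time()
    with open(AUDIT_LOG, "a") as f:
        f.write(json.dumps(event) + "\n")

def approval_checkpoint(item: dict) -> bool:
    # Block until a human explicitly decides; anything but "y" is a rejection
    log_event({"stage": "proposed", "item": item})
    answer = input(f"Approve '{item['summary']}'? [y/N] ").strip().lower()
    approved = answer == "y"
    log_event({"stage": "decided", "item": item, "approved": approved})
    return approved

draft = {"summary": "Outreach email about an outdated logo"}
if approval_checkpoint(draft):
    log_event({"stage": "executed", "item": draft})  # runs only after a human yes
else:
    log_event({"stage": "stopped", "item": draft})   # rejection is a hard stop
```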
Interestingly, teams didn’t resist this constraint. They embraced it. Human verification appeared naturally in every strong solution, reinforcing a shared understanding: automation without accountability isn’t progress.
At the same time, the hackathon opened conversations about where automation can safely increase over time—once workflows are trusted, observable, and well-monitored.
Q: Could you give examples of the tools that were used during the hackathon?
David: One of the most important design principles of the hackathon was that this wasn’t about showcasing a single “magic” platform. Instead, teams were encouraged to explore how different tools can be orchestrated into meaningful workflows, with AI acting as the connective tissue and humans remaining firmly in control.
Across teams, marketers worked with a mix of AI models, low-code orchestration tools, and everyday platforms they already knew. GPT-based models were used for reasoning, content generation, and decision-making. For visual verification and image analysis, teams experimented with tools like Gemini Vision, Kling, and Fetch Vision. Discovery and enrichment relied heavily on search-based tools such as SERP API, as well as enrichment services like Hunter for email identification.
Workflow orchestration was a major learning area. Teams explored low-code automation platforms, including n8n, to connect tools end to end—from discovery through verification, approval, and execution. Google Sheets and Google Drive played an important supporting role, acting as shared control layers for data, assets, and handoffs.
Equally important were the collaboration and governance tools. Slack was used extensively for human-in-the-loop approvals, notifications, and decision checkpoints, while Gmail enabled controlled, human-approved outreach. Some teams also explored tools like Apify for social listening and public web scraping, particularly for competitive and account-based scenarios.
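As a rough illustration of that kind of wiring, the sketch below chains a SerpApi search into a Slack review message using the public google-search-results and slack_sdk Python packages. The channel name, query, and result limit are assumptions made for the example, not the teams’ actual configuration.

```python
import os

from serpapi import GoogleSearch  # pip install google-search-results
from slack_sdk import WebClient   # pip install slack_sdk

slack = WebClient(token=os.environ["SLACK_BOT_TOKEN"])

def discover(query: str) -> list[dict]:
    # Discovery step: run a web search via SerpApi and keep the top hits
    results = GoogleSearch({"q": query, "api_key": os.environ["SERPAPI_KEY"]}).get_dict()
    return results.get("organic_results", [])[:5]

def request_approval(hit: dict) -> None:
    # Human-in-the-loop step: surface each finding in Slack for review;
    # nothing is emailed or changed until a person responds
    slack.chat_postMessage(
        channel="#mkt-ai-approvals",  # assumed channel name
        text=f"Review finding: {hit.get('title')} | {hit.get('link')}",
    )

for hit in discover('"Ataccama" logo -site:ataccama.com'):
    request_approval(hit)
```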
What mattered most wasn’t the tools themselves, but how they were combined. The hackathon helped teams realize that AI value doesn’t come from a single model—it comes from well-designed workflows, clear decision points, and responsible orchestration.
Q: Tell us more about the projects the teams worked on—and which pains they addressed
David: The projects presented during the hackathon were deeply rooted in real, everyday marketing challenges. Each team focused on a problem that was manual, repetitive, or difficult to scale—then explored how AI-driven workflows could help, without removing human accountability.
One of the standout projects was Polaris, developed by the Brand team. The pain point was brand protection: discovering outdated or incorrect logo usage across the web is time-consuming, reactive, and expensive when done manually. Polaris automates discovery and verification using AI-powered search and image analysis, then routes findings through human approval before any outreach is sent. The result is a scalable, proactive approach to brand compliance that keeps ownership firmly in marketing’s hands.
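A verification step in that spirit might look like the sketch below, which asks a Gemini vision model to flag a candidate image as outdated, current, or unsure. The model name, prompt, and placeholder guidelines are assumptions for illustration; Polaris’s actual pipeline isn’t shown here.

```python
import os

import google.generativeai as genai  # pip install google-generativeai
import requests

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model choice

def verify_logo(image_url: str) -> str:
    # Verification step: ask a vision model whether a found logo looks
    # outdated; the output is advisory and a human makes the final call
    image = requests.get(image_url, timeout=10).content
    prompt = (
        "Current brand guidelines: <guidelines go here>. "
        "Does the attached image show an outdated or altered version of the logo? "
        "Answer OUTDATED, CURRENT, or UNSURE, with one sentence of reasoning."
    )
    response = model.generate_content(
        [prompt, {"mime_type": "image/png", "data": image}]
    )
    return response.text  # routed to a human approval queue, never acted on directly
```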
The ContentOS team tackled another familiar challenge: content creation that requires heavy coordination across teams and often takes days to complete. Their workflows showed how AI agents can draft blogs, social posts, and emails using internal knowledge and external context—while humans remain responsible for quality and final approval. The project highlighted both the promise and the current limitations of AI, surfacing valuable lessons about workflow design, scope control, and system reliability.
Community engagement was the focus of Community Buddy, an AI-powered assistant designed to support users in Ataccama’s community channels. The team addressed the pain of repetitive questions and delayed responses by creating a workflow that suggests answers, resources, and next steps—always reviewed by humans. The goal is not to replace support, but to reduce noise, improve response times, and allow experts to focus on more complex issues.
Other teams explored AI in areas like competitive intelligence and account-based marketing. The Competitive Scout project aimed to reduce the manual effort required to keep battle cards and market insights up to date, while the ABM Agent focused on automating account research and personalized outreach. Both projects surfaced the same core insight: AI can save significant time, but only when workflows are structured, inputs are clear, and decision points are explicit.
Across all projects, the pattern was clear. The hackathon wasn’t about flashy demos—it was about solving real problems, learning through friction, and discovering that marketers can design powerful AI systems when the mindset shifts from using tools to building workflows.
Q: What are the most important outcomes of the marketing hackathon?
David: Ask about the biggest takeaway, and it’s not a specific demo or workflow. Looking ahead, the team is already thinking about how to measure impact. Efficiency gains will matter, but so will quality, consistency, reuse of workflows, and confidence across the team. Some metrics will be quantitative. Others will be cultural. As an organisation, we are shifting towards being an AI-driven, people-first company that scales the business while scaling people’s skills.
Because the most important outcome of the hackathon wasn’t a tool.
It was a mindset shift.
When teams Aim High, operate as ONE, and trust themselves to experiment responsibly, the future doesn’t feel distant or intimidating. It feels close. Buildable. And unmistakably human.
If you’re passionate about AI-driven marketing, join us on the journey. Check out our open roles.