<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Hamza Bendemra | AI Engineering</title>
    <description>Writing about GenAI, LLMs and enterprise AI.</description>
    <link>https://bendemra.ai/</link>
    <item>
      <title>Week 1 — Attention Is All You Need</title>
      <link>https://bendemra.ai/posts/week-01-attention-is-all-you-need/</link>
      <guid isPermaLink="true">https://bendemra.ai/posts/week-01-attention-is-all-you-need/</guid>
      <description>The 2017 paper that replaced RNNs with self-attention and became the architectural foundation for every major language model since. Here&apos;s what it actually says and why it still matters.</description>
      <pubDate>Sun, 05 Apr 2026 00:00:00 GMT</pubDate>
    </item>
    <item>
      <title>20 Foundational GenAI Papers in 20 Weeks</title>
      <link>https://bendemra.ai/posts/20-papers-20-weeks/</link>
      <guid isPermaLink="true">https://bendemra.ai/posts/20-papers-20-weeks/</guid>
      <description>I build AI systems for a living but kept noticing a gap between using the technology and understanding the research behind it. So I&apos;m closing it — publicly.</description>
      <pubDate>Thu, 02 Apr 2026 00:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>