What's the bare minimum GPU needed for reasonable local LLM performance? AMD or Intel cards?

Basically this. I want to run DeepSeek-V3 locally, but right now I don't have a modern GPU (just laptops). I want to know what I need to reasonably use an LLM for chat and coding assistance.