Upgraded from a 2080 Ti to a 5070 Ti. I admit I was hoping for a much bigger framerate boost, going by the reviews I saw. Possibly a bit CPU-limited with my 5900X, but AM5 doesn't look that interesting to me: few PCIe lanes and expensive.
That said, it made a nice upgrade for running local LLMs: more memory and noticeably faster.
I have a 2080 Ti mainly for development and I'm now thinking of buying another PC with an RTX 5060 Ti for running local LLMs. Comparing the 5060 Ti (16 GB) to the 5070 Ti: apart from being half the price, the 5060 Ti also has only half the memory bus width, 128-bit instead of 256-bit. For running local LLMs, is it really worth paying double the price for the 5070 Ti?
Haven't seen benchmarks for the 5060 Ti, but in the 40-series the 4060 also had half the memory bandwidth, and that led almost directly to half the LLM performance.
That said, if you just use it for dabbling, i.e. not millions of documents and whatnot but more as your local ChatGPT, you're talking about 2 seconds vs 4 or whatever. So probably not.
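
For a rough sense of why bus width matters: token generation is usually memory-bandwidth bound, since every generated token streams the full set of weights out of VRAM, so a back-of-envelope upper bound is just bandwidth divided by the weight size. Quick sketch; the bandwidth figures and model size are my assumptions, and real throughput will land below this ceiling:

    # Back-of-envelope estimate: tokens/sec ceiling ~= memory bandwidth / bytes read per token
    # (roughly the size of the quantized weights). All numbers below are assumptions.

    def est_tokens_per_sec(bandwidth_gb_s, weights_gb):
        return bandwidth_gb_s / weights_gb

    weights_gb = 8.0  # e.g. a ~13B model at 4-5 bit quantization (assumed)
    cards = {"5060 Ti 16GB (~448 GB/s)": 448, "5070 Ti (~896 GB/s)": 896}
    for name, bw in cards.items():
        print(f"{name}: ~{est_tokens_per_sec(bw, weights_gb):.0f} tok/s upper bound")

Halve the bandwidth and the ceiling roughly halves, which lines up with what happened with the 4060.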