  1. Tesla K80 uses? : r/homelab - Reddit

    Welcome to your friendly /r/homelab, where techies and sysadmins from everywhere are welcome to share their labs, projects, builds, etc.

  2. Nvidia Tesla M10 - good for anything? : r/homelab - Reddit

    Nov 16, 2023 · M10 is Maxwell, quite old, so it lacks features like tensor cores for fast float16 compute and any support for bfloat16/TF32; not great for Plex (super old version of NVENC hardware), and would …

  3. Regarding NVIDIA TESLA M40 (24GB), is it the same as an RTX ... - Reddit

    Mar 7, 2023 · Regarding NVIDIA TESLA M40 (24GB), is it the same as an RTX 4090 (24GB) for chat AI? If we assume budget isn't a concern, would I be better off getting an RTX 4090 that already has …

  4. Elon Musk Unveils Tesla AI5: 10x Power by 2025 - Reddit

    Jun 21, 2024 · Tesla Inc. is an energy + technology company originally from California and currently headquartered in Austin, Texas. Their mission is to accelerate the world's transition to sustainable …

  5. Nvidia P40, 24GB, are they useable? : r/LocalLLaMA - Reddit

    May 7, 2023 · Given that some of the processing is limited by VRAM, is the P40 24GB line still useable? That's as much VRAM as the 4090 and 3090 at a fraction of the price. Certainly less powerful, but if VRAM is …

  6. Is the nvidia P100 a hidden gem or hidden trap? - Reddit

    The Nvidia "Tesla" P100 seems to stand out: 16GB, approximate performance of a 3070... for $200. Actual 3070s with the same amount of VRAM or less seem to be a LOT more. It seems to be a way to run …

  7. Nvidia CEO explains why Tesla's use of AI is 'revolutionary'

  8. Nvidia Tesla for machine learning & AI lab : r/homelab - Reddit

    May 23, 2024 · Depends what you wanna do. Running a chatbot? Don't need more than 1 card in a homelab. Doing some parallel picture generation or neural network stuff? Then maybe, depends on the …

  9. Worth to get a Tesla P4? : r/homelab - Reddit

    I am thinking about adding a GPU to my homelab, mostly to support video processing and to play around with some self-hosted generative AI apps. My current homelab setup limits me to either 1 slot …

  10. Nvidia Tesla K80 : r/LocalLLaMA - Reddit

    Aug 19, 2023 · LocalLLaMA: a subreddit to discuss Llama, the large language model created by Meta AI.