Yes, my point is that the compute from those chips can still be used. Maybe on actually useful machine learning tools that will be developed later, or some other technology that can make use of this kind of parallel computing.
I know of at least one company that uses CUDA for ray-tracing for, I believe, ground research, so there are definitely already some useful things happening.
I mean, there are a lot of applications for linear algebra, although I admit I don't fully know in what way "AI" uses linear algebra or which other uses overlap with it.
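For what it's worth, the core of most neural-network inference really is plain linear algebra: each dense layer is a matrix-vector product plus a bias. Here's a minimal sketch in Python with NumPy; the shapes and values are made up for illustration, not taken from any real model.

```python
import numpy as np

# A dense neural-net layer's forward pass is y = W @ x + b.
# Shapes below are illustrative assumptions.
rng = np.random.default_rng(0)

x = rng.standard_normal(4)        # input vector: 4 features
W = rng.standard_normal((3, 4))   # weight matrix: 3 outputs x 4 inputs
b = rng.standard_normal(3)        # bias vector

y = W @ x + b                     # one layer: a matrix-vector multiply
print(y.shape)                    # a 3-element output vector
```

That same matrix-multiply workload is what simulation, graphics, and scientific computing lean on, which is why the overlap between "AI" compute and other GPU uses is so large.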
these are compute GPUs that don’t even have graphics ports
I’m waiting on the A100 fire sale next year