A training step on a CPU in an intel/intel-optimized-tensorflow-avx512 container takes 138 ms.
It is slower than my old GPU, but it might be fast enough to get a first version of a model trained.
A training step takes 38 ms on an Nvidia K40, which I got for $100.
On Google's Colab, a training step takes 21 ms. (I don't remember which GPU I used.)
Colab is not expensive, but long training runs are annoying because the connection is likely to drop.
I'm willing to compromise on speed in favor of ease of development and early testing on a local machine.
If needed, the final model can always be trained in the cloud on a beefy GPU.
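For reference, the per-step numbers above come from plain wall-clock timing. A minimal sketch of such a harness (the `time_step` helper is hypothetical, not the exact code I ran):

```python
import time

def time_step(step_fn, n_warmup=3, n_iters=20):
    """Return the mean wall-clock time of step_fn in milliseconds."""
    # A few warm-up calls so one-off setup cost (graph tracing,
    # memory allocation, kernel compilation) doesn't skew the result.
    for _ in range(n_warmup):
        step_fn()
    start = time.perf_counter()
    for _ in range(n_iters):
        step_fn()
    return (time.perf_counter() - start) / n_iters * 1000.0

# Usage: pass the training step as a zero-argument callable, e.g.
# time_step(lambda: model.train_on_batch(x, y))
```

The warm-up matters especially on GPUs, where the first step includes kernel compilation and is not representative.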
The conda environment.yml that I used to build the environment is here: https://gist.github.com/dimazest/40571dcec7de84601abdfe7b12445040
LD_LIBRARY_PATH might need to be redefined; I used a single command to fire up a notebook.
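A sketch of what such a launch command can look like (an assumption on my part, not the exact command I ran; it presumes the conda env is already activated so `$CONDA_PREFIX` is set):

```shell
# Point the dynamic loader at the env's own libraries
# for this process only, then start Jupyter:
LD_LIBRARY_PATH="$CONDA_PREFIX/lib" jupyter notebook
```

Setting the variable inline keeps the override scoped to the notebook process instead of polluting the whole shell session.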
It does work!
I hope this NUMA warning is safe to ignore
it's too early to give up
so I gave up on conda and installed what I need with pip...
@dima you can block certain keywords like "RT", "Retweet", "Twitter" and "Birdsite" to filter out at least some content.
To be fair, ai-benchmark assigned a score of 5330 to the GPU and 1397 to the CPU.
I got an old but cheap Nvidia GPU (a K40) to play with deep learning while I'm searching for a reasonably priced modern card.
To my surprise, the i7-1165G7 CPU (4 cores / 8 threads) is about twice as fast as the GPU at classifying images with a CNN.
Is this something one would expect? Did CPUs get better recently? Or is the GPU I got just too slow?
It works! Now when I connect any of my laptops, the keyboard and the mouse just work.
It looks like it's possible to pair Bluetooth devices to a dongle that is shared between several OSs. https://wiki.archlinux.org/title/bluetooth#Dual_boot_pairing mentions dual boot, but I'll try to use this trick so that I could connect a keyboard and a mouse to whichever laptop is currently connected to my USB hub.
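On Linux, BlueZ keeps the pairing link keys on disk, which is what makes the trick possible. A sketch of where to look (the adapter and device addresses below are placeholders, substitute your own):

```shell
# Hypothetical MAC addresses -- replace with your adapter's and device's.
ADAPTER=AA:BB:CC:DD:EE:FF
DEVICE=11:22:33:44:55:66

# BlueZ stores per-device pairing info (including the link key) here;
# reading it needs root:
sudo cat "/var/lib/bluetooth/$ADAPTER/$DEVICE/info"

# Copying this device directory to the same path on the other OS
# (and restarting bluetoothd) lets it reuse the existing pairing.
```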
In recent years I've gotten a lot of mileage out of Python, but the packaging, building, and distribution story has been in constant flux for as long as I've been developing with it. I never feel like I'm doing it right, and the blog post below confirms I have a boatload of code that is apparently doing it wrong. This shifting packaging story is a constant source of friction in this particular ecosystem, one that makes me wish I hadn't started down the Python path.
Computer science, computational linguistics, running, swimming, photography.