I've just upgraded to Mastodon v3.4.6.

In addition to the security fix, I now use prebuilt container images.

Did I write "container images" rather than "docker images" on purpose? Absolutely! I don't use Docker; I use Podman and systemd to run Mastodon.
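For the curious, podman can generate the systemd units itself. A sketch of that config workflow, where "mastodon-web" is a placeholder container name, not my actual setup:

```shell
# Generate a systemd unit file for an existing container
# ("mastodon-web" is an illustrative name)
podman generate systemd --new --files --name mastodon-web

# Install it as a user service and start it
mv container-mastodon-web.service ~/.config/systemd/user/
systemctl --user daemon-reload
systemctl --user enable --now container-mastodon-web.service
```

With `--new`, the unit recreates the container on each start, so image updates are picked up by replacing the image and restarting the service.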

It could have been a good option for a data center, but not for an office desk.

My next idea was to try liquid cooling. I got a GPU mounting bracket from NZXT and a Corsair fan from eBay.

The end result looks great and is quiet enough.

My noise baseline is the fridge. My laptop and the GPU enclosure are quieter than the fridge.

I had a broken Titan card and decided to use a fan from there.

I replaced the heat sink and glued the fan on. The result worked, but it was still loud.

To mitigate the issue, I got a fan controller. It slowed the fan down when the card was cool.

Initially I used a fan that attaches to the side of the card.

It did keep the card cool, but it was so loud I couldn't get any work done.

I'm using the card in an external enclosure. In theory, I could have provided the required airflow, but I decided to go an alternative way.

More on my cheap GPU experience.

I got a Tesla K40m. This card is meant to be used in workstations and data centers. It doesn't have a fan. Instead, it has a massive heat sink. It's assumed that there is enough airflow to keep the thing cool.

A training step on a CPU in an intel/intel-optimized-tensorflow-avx512 container takes 138 ms.

It is slower than my old GPU, but it might be fast enough to get the first version of a model.
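A minimal sketch of how a per-step time like this can be measured (the warmup and iteration counts are arbitrary choices; `step_fn` would wrap the actual training call):

```python
import time

def time_step(step_fn, warmup=3, iters=20):
    """Return the average wall-clock time of one call to step_fn, in ms."""
    for _ in range(warmup):
        # Absorb one-time costs (e.g. TensorFlow graph tracing/compilation)
        step_fn()
    start = time.perf_counter()
    for _ in range(iters):
        step_fn()
    return (time.perf_counter() - start) / iters * 1000.0
```

For a Keras model, `step_fn` could be something like `lambda: model.train_on_batch(x, y)` (hypothetical model and data names).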

A training step takes 38 ms on an Nvidia K40, which I got for $100.

On Google's Colab, a training step takes 21 ms. (I don't remember which GPU I used.)

Colab is not expensive, but it is annoying for long training runs, as the connection is likely to drop.

I'm willing to compromise speed in favor of ease of development and early testing on a local machine.

If needed, the final model can always be trained in the cloud on a beefy GPU.

The conda environment.yml that I used to build the environment is here

LD_LIBRARY_PATH may need to be redefined; I used this command to fire up a notebook
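The exact command isn't preserved here; a typical shape for this environment tweak, assuming the CUDA libraries live under the conda env prefix, would be:

```shell
# Hypothetical reconstruction (the original command is not shown above).
# Prepend the conda env's lib dir so the loader finds its libraries,
# then start the notebook server.
export LD_LIBRARY_PATH="$CONDA_PREFIX/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
jupyter notebook
```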


So I gave up on conda and installed what I need with pip...
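Assuming the stock PyPI packages (which, unlike conda-forge, ship both add-ons), that boils down to something like:

```shell
# Install TensorFlow plus the two add-ons conda-forge couldn't provide together
pip install tensorflow tensorflow-hub tensorflow-text
```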

I've spent too much time trying to install TensorFlow with the packages I want in a conda env. For some reason, conda-forge has tensorflow-hub but doesn't have tensorflow-text. I need both to run a model.

What's the most user-friendly way to get TensorFlow up and running?

Dima boosted

@dima you can block certain keywords like "RT", "Retweet", "Twitter" and "Birdsite" to filter out at least some content.

I wish it were possible to filter out Twitter content

To be fair, ai-benchmark assigned a score of 5330 to the GPU and 1397 to the CPU.

I got an old but cheap Nvidia GPU (K40) to play with deep learning while I'm searching for a reasonably priced modern card.

To my surprise, the i7-1165G7 CPU (8 cores) is about twice as fast as the GPU at classifying images with a CNN.

Is this something one would expect? Did CPUs get better recently? Is the GPU I got too slow?

It works! Now, when I connect any of my laptops, the keyboard and the mouse just work.
