About:
Alex Rogozhnikov is a mathematician and physicist with a passion for machine learning and data analysis, sharing insights on his blog 'Brilliantly Wrong'.
The post critiques the performance plateau of protein language models, emphasizing the need for diverse training data and questioning the assumption that larger models yield better results.
The text discusses the need for fast autograd and the author's search for the fastest implementation. It compares autograd in PyTorch, JAX, Python, Rust, C, and assembly, providing benchmarks for each. The author also experiments ...
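To make the comparison concrete, here is a minimal scalar reverse-mode autograd in pure Python — an illustrative sketch only, not the benchmark code from the post:

```python
# Minimal scalar reverse-mode autograd (illustrative sketch,
# not the benchmark code from the post).

class Value:
    """A scalar that records the operations applied to it."""
    def __init__(self, data, parents=(), grad_fns=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents      # Values this one depends on
        self._grad_fns = grad_fns    # local derivative w.r.t. each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other),
                     (lambda g: g, lambda g: g))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other),
                     (lambda g: g * other.data, lambda g: g * self.data))

    def backward(self):
        # Topologically order the graph, then push gradients backwards.
        order, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for parent, fn in zip(v._parents, v._grad_fns):
                parent.grad += fn(v.grad)

x = Value(3.0)
y = x * x + x          # y = x^2 + x, so dy/dx = 2x + 1 = 7 at x = 3
y.backward()
print(x.grad)          # → 7.0
```

The torch/jax/rust/C versions in the post trade this simplicity for speed; the graph-walk above is exactly the part that gets optimized away.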
The text provides an overview of optical pooled CRISPR screens, comparing two preprints and discussing the technology's uses, effectiveness, and challenges. It also delves into the main selling points of optical pooled screens and...
Einops is a unique and widely used tensor manipulation tool that has been around for 5 years. It supports a dozen frameworks and has faced many technical and conceptual challenges. The project has been resilient despite slow adopt...
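For readers unfamiliar with einops, its patterns map onto ordinary transpose/reshape operations; here is that correspondence written out with plain numpy (a sketch of the notation, not einops itself):

```python
import numpy as np

# What einops patterns do, written out with plain numpy —
# a sketch of the notation, not the einops library itself.

x = np.zeros((2, 32, 32, 3))                  # batch, height, width, channels

# rearrange(x, 'b h w c -> b c h w')  ≈  a transpose:
as_channels_first = x.transpose(0, 3, 1, 2)   # shape (2, 3, 32, 32)

# rearrange(x, 'b h w c -> b (h w) c')  ≈  a reshape that merges two axes:
as_tokens = x.reshape(2, 32 * 32, 3)          # shape (2, 1024, 3)

print(as_channels_first.shape, as_tokens.shape)
```

The pattern string names every axis on both sides, which is what lets einops check shapes and stay readable across the dozen frameworks it supports.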
The text discusses the paradigm shift from transition-focused to state-focused programming, particularly in the context of schema migration in databases. It highlights the benefits of auto-migration tools and proposes that t...
The text argues for the wider usage of delimiter-first in code, proposing a new top-level syntax for programming languages. It discusses the advantages of this method, the controversy surrounding it, and its application in differe...
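The general comma-first convention the post builds on (familiar from Haskell style guides) can be shown in Python, which happily parses leading commas inside brackets — this illustrates the layout only, not the new top-level syntax the post proposes:

```python
# Delimiter-first (comma-first) layout: each continuation line starts
# with the separator, so adding or removing the last element never
# touches a neighbouring line. Valid Python as written:

colors = [ "red"
         , "green"
         , "blue"
         ]

config = { "lr": 1e-3
         , "batch_size": 32
         }

print(colors, config)
```

The delimiters line up in a column, so a missing or extra one is visible at a glance — one of the advantages the post argues for.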
The post provides a detailed overview of microscopy, covering various types of microscopes, imaging techniques, and challenges in experiment design. It also discusses the limitations and complexities of microscopy, as well as the ...
The post discusses the drawbacks of writing command-line interfaces (CLIs) and suggests using code to parameterize calls instead. It emphasizes that CLIs are unreliable, require constant maintenance, and have poor error handling. ...
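The suggested alternative can be sketched as follows (all names here are hypothetical, for illustration only): expose a plain function and call it with typed arguments, rather than serializing them into a CLI string.

```python
# Hypothetical sketch of the post's suggestion: instead of serializing
# arguments into a CLI string such as
#   train --lr 0.001 --epochs 10 --model resnet
# expose a plain function and call it from code.

def train(lr: float = 1e-3, epochs: int = 10, model: str = "resnet"):
    """Typos and type errors surface at the call site, not at parse time."""
    assert epochs > 0, "epochs must be positive"
    return f"training {model} for {epochs} epochs at lr={lr}"

# The call is checked by the language itself — no string parsing,
# no silently ignored unknown flag:
result = train(lr=3e-4, epochs=5)
print(result)
```

A misspelled keyword argument raises a `TypeError` immediately, whereas a misspelled CLI flag is often accepted or ignored — the unreliability the post complains about.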
The text discusses a trick for better comparison of deep learning models, especially useful for segmentation models. It suggests training models side-by-side within the same process to achieve high similarity in the training proce...
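A toy version of the idea (not the post's code): two model variants step through the same shuffled batches within one process, so differences in the outcome come from the models rather than from data order.

```python
import numpy as np

# Toy sketch of side-by-side training (not the post's code): two
# variants of a linear model share every shuffled batch, so the only
# difference between their training runs is the model itself.

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 4))
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + 0.1 * rng.normal(size=256)

w_a = np.zeros(4)          # variant A: learning rate 0.1
w_b = np.zeros(4)          # variant B: learning rate 0.01

for epoch in range(20):
    order = rng.permutation(len(X))          # one shuffle, shared by both
    for i in range(0, len(X), 32):
        idx = order[i:i + 32]
        xb, yb = X[idx], y[idx]
        for w, lr in ((w_a, 0.1), (w_b, 0.01)):
            grad = 2 * xb.T @ (xb @ w - yb) / len(xb)
            w -= lr * grad                   # in-place SGD update

print(np.round(w_a, 2))    # variant A should sit near the true weights
```

With shared data order, any gap between the variants' curves reflects the models — the high training-process similarity the post is after.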
Einops — a new style of deep learning code
2018-12-06
...
The post discusses clustering techniques, particularly k-means and DBSCAN, and their application to the OPERA experiment. It explains the limitations of k-means and the importance of choosing the right distance metric. The author ...
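For reference, here is a toy k-means in numpy (not the OPERA analysis from the post): it alternates between assigning points to the nearest centroid and recomputing centroids, with the Euclidean metric baked into the assignment step — exactly where the post's point about choosing the right distance applies.

```python
import numpy as np

# Toy k-means (not the OPERA analysis from the post): alternate
# between assigning each point to its nearest centroid and moving
# each centroid to the mean of its points. Note the Euclidean
# metric hard-wired into the assignment step.

def kmeans(points, k, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(n_iter):
        # Squared Euclidean distance: every point vs. every centroid.
        d2 = ((points[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
        labels = d2.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):           # skip empty clusters
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids, labels

# Two well-separated blobs should be recovered cleanly.
rng = np.random.default_rng(1)
blob_a = rng.normal(loc=(0, 0), scale=0.3, size=(50, 2))
blob_b = rng.normal(loc=(5, 5), scale=0.3, size=(50, 2))
centroids, labels = kmeans(np.vstack([blob_a, blob_b]), k=2)
print(np.round(centroids, 1))
```

Swap the `d2` line for a different metric and the clusters change — which is the limitation of k-means the post dwells on.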