Rust has strengths, but IMO, the added cognitive overhead and syntactic noise of having to deal with things like Rc<RefCell<Box<MyType>>> and borrowing vs. mutable borrowing are a huge step backwards in terms of usability. It completely sacrifices ergonomics to please the PLT gods.
Understanding C can genuinely help you understand how computers work on a deeper level. Is it the only way? No. But it's an effective way to deepen your understanding.
There are fundamental concepts you may never grasp if you've never used a systems language, e.g. cache locality.
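To illustrate cache locality, here's a rough Python sketch of row-major vs. column-major traversal. Note this is just to show the access patterns: CPython's boxed objects blunt the effect, while in C the same change in traversal order can make a loop several times slower.

```python
import time

# Build an N x N grid as a list of rows (row-major layout).
N = 1000
grid = [[1] * N for _ in range(N)]

def sum_row_major(g):
    # Visits elements in the order rows are laid out: cache-friendly.
    total = 0
    for row in g:
        for v in row:
            total += v
    return total

def sum_col_major(g):
    # Jumps between rows on every access: large, cache-unfriendly strides.
    total = 0
    for j in range(len(g[0])):
        for i in range(len(g)):
            total += g[i][j]
    return total

t0 = time.perf_counter(); a = sum_row_major(grid); t1 = time.perf_counter()
b = sum_col_major(grid); t2 = time.perf_counter()
assert a == b == N * N
print(f"row-major: {t1 - t0:.3f}s, col-major: {t2 - t1:.3f}s")
```

In C, the two loops compile to the same arithmetic but wildly different memory traffic, which is exactly the kind of thing you only internalize by using a systems language.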
“You don't need to learn C, I didn't”
Anyone telling you that is limiting you in favor of their ego.
Most of the world's software is in C/C++.
Operating systems, browsers, game engines, your favorite programming language: most likely written in C/C++.
It certainly helps to know the language they're written in.
My teammates and I are working on building a Ruby JIT compiler inside MRI at Shopify, and we're hiring. If that sounds interesting to you and you think you might like to work with me on this project, please reach out :)
#compilers
I've come to realize that feeling useful is a basic human need. When I don't feel like I'm contributing something worthwhile, working on something I truly believe in, I feel terrible. It may seem obvious to you, but I didn't realize this until I was past 30.
Zig is working on implementing its own backend to replace LLVM 🤔
As someone with experience implementing a compiler backend, I think that getting an MVP working is not that hard, but beating LLVM's performance will take a lot of work.
An underrated performance advantage that systems languages have (vs. say Java) is the ability to design tightly packed memory layouts. If your program needs to access lots of data, memory bandwidth and cache efficiency start to matter. In C/Rust, you can make every bit count.
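A quick way to see what "tightly packed" buys you, sketched with Python's stdlib `struct` module (the same layout you'd get from a packed C struct), compared against the overhead of boxed objects:

```python
import struct
import sys

# A record with two 32-bit floats and a one-byte flag, packed with no
# padding ("<" = standard sizes, no alignment): 4 + 4 + 1 = 9 bytes.
packed = struct.calcsize("<ffB")
print(packed)  # 9 bytes per record

# The same three values as individual Python objects carry far more
# overhead (each float/int is a full heap object with a header).
overhead = sys.getsizeof(1.0) + sys.getsizeof(1.0) + sys.getsizeof(1)
print(overhead)  # dozens of bytes, before even counting the pointers to them
```

Nine bytes per record means millions more records per cache line's worth of bandwidth than a pointer-chasing object layout.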
I hate it when languages try to be "smart" and return some weird type when I'm expecting an array/list. It's super awkward to have to deal with some LazyWeirdVoodooShitIterator or FunkyWeirdIterable that's not interchangeable with a normal f'ing list.
I'm really excited that our Ruby JIT compiler is now part of Ruby 3.2 & live in production, serving requests all across the world.
Thanks Shopify, Ruby & Rails Infrastructure, and Ruby Core for the opportunity to lead a dream team and the support to make all of this possible!
All storefront requests are now served by the latest version of ruby with YJIT enabled! We are seeing ~10% speedups across the board.
YJIT has been developed by @Love2Code and team at Shopify.
Functional programming means algebraic thinking. Algebraic thinking requires types to be as interchangeable as possible. Interchangeable types means fewer types. Therefore, functional programming should by definition strive towards minimalism. Fewer types, less complexity.
VSCode is becoming an operating system. It has a file browser, a package manager, auto updates, a privilege system (trusted windows), a weird shell (command palette), and the ability to connect to remote machines.
The obsession with "memory safety" is annoying
1. How provably safe is your program if it's full of unsafe blocks?
2. There are many other kinds of safety besides "memory safety". E.g. you can't claim that your system is "safe" if it can panic at any time
The idea that more data will automatically lead to deep learning outperforming humans is very much cargo cult IMO. Bigger scale isn't always the answer. It's possible that your model, because of its structure, can't make the inferences necessary to solve the problem effectively.
To those who write C++ on a regular basis, do you feel like the language has improved with recent additions or gotten worse? My (outside) impression is that C++ keeps getting more complex and it's not a good thing. AFAIK C++20 modules are still incomplete in GCC and Clang.
I'm coding a toy programming language in Rust. Just me and the code. It's nice to write code just for fun.
It's going to have a top-down recursive descent parser that will parse a JS-like syntax directly into bytecode, without a lexer, and without building an AST.
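Here's the rough shape of the idea as a toy Python sketch (the real project is in Rust, and the opcode names here are made up): a recursive-descent parser that reads characters directly, with no separate lexer, and emits stack bytecode as it goes instead of building an AST.

```python
def compile_expr(src):
    """Parse arithmetic straight to stack bytecode: no lexer, no AST."""
    code = []   # emitted bytecode: list of (opcode, operand?) tuples
    pos = 0

    def peek():
        # Skip whitespace and return the next character (or "" at end).
        nonlocal pos
        while pos < len(src) and src[pos].isspace():
            pos += 1
        return src[pos] if pos < len(src) else ""

    def expr():          # expr := term (('+'|'-') term)*
        nonlocal pos
        term()
        while peek() in ("+", "-"):
            op = src[pos]; pos += 1
            term()                      # code for the right operand first,
            code.append(("ADD",) if op == "+" else ("SUB",))  # then the op

    def term():          # term := number ('*' number)*
        nonlocal pos
        number()
        while peek() == "*":
            pos += 1
            number()
            code.append(("MUL",))

    def number():        # number := digit+
        nonlocal pos
        peek()           # skip leading whitespace
        start = pos
        while pos < len(src) and src[pos].isdigit():
            pos += 1
        code.append(("PUSH", int(src[start:pos])))

    expr()
    return code

def run(code):
    # A tiny stack VM to execute the emitted bytecode.
    stack = []
    for ins in code:
        if ins[0] == "PUSH": stack.append(ins[1])
        elif ins[0] == "ADD": b, a = stack.pop(), stack.pop(); stack.append(a + b)
        elif ins[0] == "SUB": b, a = stack.pop(), stack.pop(); stack.append(a - b)
        elif ins[0] == "MUL": b, a = stack.pop(), stack.pop(); stack.append(a * b)
    return stack[0]

print(run(compile_expr("1 + 2 * 3")))  # 7
```

The trick is that operator precedence comes from the grammar's call structure, so the bytecode comes out in the right evaluation order without ever materializing a tree.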
Bought the audiobook "The 48 Laws of Power"... then returned it. It had glowing reviews, but it's a book about how to subtly but efficiently fuck people over while always looking over your shoulder. I can't live my life like that, and I don't think it's good for anyone's mental health.
Took me a while but I managed to get this working on my little VM. A grid that smoothly scrolls towards the camera, with perspective projection, and an HSL sky gradient, compiled by my toy C compiler to UVM's bytecode format, rendering at 160FPS :D
Cue Tron Legacy - The Grid
@ylecun
@cdossman
You're making the wrong comparison. Genome encodes structure and initialization. The structure of an LLM is encoded by its source code, which can fit in only a few tens of kilobytes.
Also, chimps may not have our language skills, but they're incredibly smart.
I've been getting the sense, talking to people, that users are updating to the latest Ruby so they can use YJIT and get associated performance gains. A few years ago, it was common to be several Ruby versions behind. Now big companies are on the latest. Performance is a feature!
Nick Bostrom popularized a thought experiment where an AI tasked to maximize the production of paperclips consumes the known universe in the process.
... Just replace paperclips with crypto. A huge chunk of the silicon produced by TSMC is going into GPUs for crypto mining.
Thanks to the sustained hard work of my teammates on the YJIT team and in Ruby & Rails Infrastructure, we've been able to speed up end-to-end request times on Shopify's StoreFront Renderer (SFR) by as much as 17% on average, and even the p99 request time is faster!
@ShopifyEng
The idea that all programming will soon be automated by AI, so we don't need to teach programming, or care about code complexity, fixing problems, etc...
It's a bit like thinking: AI will soon be generating all content, so no need to learn to write or think critically? Just consume.
Users keep reporting issues running my gridworld headlessly on clusters due to the PyQt dependency, so I'm building a renderer using NumPy only. No more PyQt, and no xvfb needed to produce an RGB array. Still 1200+ FPS with one thread 👌.
#minimalism
I feel like there could be a market for a premium brand of Linux laptops. Slightly higher cost for high quality hardware, good battery life, with Linux and development tools preinstalled, known working drivers and config, no bloatware. Something pragmatic, for professional devs.
Code that I wrote on a machine with Python 3.5 doesn't run on machines with Python 3.4. They added new unpacking syntax in 3.5 which I had been relying on. Getting Python code to run reliably everywhere is an effing nightmare, and it's because of things like this.
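For reference, the 3.5 additions in question were almost certainly PEP 448's generalized unpacking, which is a SyntaxError on 3.4 and earlier:

```python
# PEP 448 (Python 3.5+): additional unpacking generalizations.
# On Python 3.4, both of the starred expressions below fail to even parse.
a, b = [1, 2], [3, 4]
merged_list = [*a, *b, 5]      # unpack multiple lists into a literal
d1, d2 = {"x": 1}, {"y": 2}
merged_dict = {**d1, **d2}     # unpack multiple dicts into a literal
print(merged_list, merged_dict)
```

The nasty part is that it's a syntax change, so the failure on an older interpreter happens at import time, for the whole file, even if the new syntax is on a code path you never run.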
During my PhD, there's one specific paper that got rejected 3 times before being accepted. My advisor and I spent 4 to 6 months working on each iteration. We worked hard and believed in our work; the reviews seemed unfair. I cried every single time that paper got rejected.
Am I the only one who misses the more lighthearted sci-fi shows of the past (eg: Stargate, TNG)? Seems nowadays everything has to be dark, gritty and depressing. Me, I think there's enough drama in the real world, I don't really need fictional shit to worry about.
We have no way to inspect modern CPUs for hardware backdoors as they are way too small/complex, but I was thinking: if you could fab a simple 32-bit RISC-V chip with an older process, with a single silicon layer, it might be possible to inspect it with a plain microscope.
Seems to me that unless you're pasting into the same program you copied from, "paste without formatting" should probably be the default. Have users right-click for "paste with formatting" instead.
When it comes to interpreters/CPUs/VMs, I often see people mixing up the definition of "instruction" and "opcode".
An opcode is the kind of operation you want to perform (e.g. add). An instruction includes an opcode and potentially multiple operands (arguments).
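A minimal sketch of the distinction, using one possible (hypothetical) encoding: a 1-byte opcode followed by two 1-byte register operands. The opcode alone just names the operation; the instruction is the opcode plus its operands.

```python
import struct

OP_ADD = 0x01   # hypothetical opcode numbering

def encode(opcode, dst, src):
    # An instruction = opcode byte + two operand bytes.
    return struct.pack("BBB", opcode, dst, src)

def decode(ins):
    opcode, dst, src = struct.unpack("BBB", ins)
    return opcode, dst, src

ins = encode(OP_ADD, 2, 7)     # the full instruction: "add r2, r7"
print(ins.hex())               # 010207
assert decode(ins) == (OP_ADD, 2, 7)
```

So "the ADD opcode" is just the byte 0x01, while "an ADD instruction" is 0x01 together with which registers (or stack slots, or immediates) it operates on.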
@cdossman
I'm not.
There just isn't enough capacity in the genome.
Your entire genome fits in 800MB (uncompressed).
The difference between the human and chimp genomes is 1% of that, or 8MB.
Not enough to encode a significant structure.
For comparison, a small 7B LLM requires 14GB.
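The arithmetic behind those numbers, spelled out (assuming fp16/bf16 weights at 2 bytes per parameter):

```python
# Back-of-the-envelope check of the sizes above.
genome_mb = 800                  # human genome, uncompressed, roughly
diff_mb = genome_mb // 100       # ~1% human/chimp difference
print(diff_mb)                   # 8 MB

params = 7_000_000_000           # a "7B" model
bytes_per_param = 2              # fp16/bf16 weights
model_gb = params * bytes_per_param / 1e9
print(model_gb)                  # 14.0 GB
```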
@tobi
I was secretly hoping you'd publicly announce it so I could tell my friends and celebrate this a little bit during the holiday ^_^
Thanks for creating a culture that supports risky bets Tobi! :)
People act like the rise of locked-in computing ecosystems is inevitable. It's not inevitable if you do something about it. There will always be a need for open systems, both for developers, and for businesses who don't want vendor lock-in or price gouging.
I'm working on a programming language slash IDE for sound synthesis (similar vein to csound/supercollider), and I was wondering if you guys could help me find a cool name for it. Please reply with ideas :)
I’ve used this sound card for over 17 years. Bought it on September 11, 2001 and carried it from one computer upgrade to another. Sadly now switching to a motherboard that doesn’t have PCI slots! I hope the onboard sound on this one isn’t as noisy as it usually is.
Once we get to the point where machines can design machines, they're quickly going to evolve into something we can no longer understand (or troubleshoot). We'd better be god damn sure we didn't leave any bugs in the code... Good thing humans are so good at writing bug-free code.
Many people seem to have this cynical attitude that if you can't immediately beat the state of the art, it's not worth doing. That attitude is completely idiotic and anti-progress.
Today’s accomplishment: getting a Jetson Nano to run stably off lithium-ion cells. Most of the buck converters I tried didn’t cut it. Chinese parts off eBay tend to exaggerate their specs.
Bought a stationary bike and was horrified to find after assembling it that you need to sign up for a paying monthly subscription to "activate" it. You can cancel after, but they make it hard, and they force you to enter your CC number to get the bike working. Scammy as fuck.
I used to be kind of addicted to YouTube, watching way too many videos every day, but the recommendation algorithm is so terrible, always re-feeding you more of the same, that the problem seems to have taken care of itself. I open up the app and instantly feel bored.
I got an electrician to come do the wiring so I can have a permanently installed disco ball in my living room. There's a switch on the wall controlling the motor and an RGB LED projector. The motor is embedded into the ceiling 🕺💃
Google Maps keeps informing me that my daily walk to work is a 3 minute drive, with no traffic. I guess people in Mountain View can't imagine walking places.
@AgileJebrim
I feel like this is terrible on multiple levels. This is not code that ever needs to be optimized for performance, and doing SIMD intrinsics by hand should be reserved for the most performance-critical kernels.