It is with immense pleasure that
@KathiFisler
, Ben Lerner,
@joepolitz
, and I announce the first version of our new book, DCIC: a Data-Centric Introduction to Computing. This brief thread explains the book a little. 1/10
I've read multiple times that Reddit 1.0 was written in Lisp. I didn't realize the source is public.
It's amazing. You can read the whole thing in one sitting. Even an undergrad could. It's like the essence of a…Reddit.
We took a wrong turn w/ software.
1/ Many will tell you why Python is great for teaching coding, so I'll tell you ways it's not.
State is a bad default. It should be legal but safe & rare. The arc of programming is long and bends towards immutability. Its early use creates messes (e.g., "a variable is a box"). ⤵
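A small Python sketch of the "variable is a box" mess (the course-roster example is hypothetical, but the mutable-default pitfall is real Python behavior):

```python
# Mutable state by default: the default list is ONE shared box across calls.
def add_course(course, roster=[]):
    roster.append(course)
    return roster

a = add_course("CS101")
b = add_course("CS102")
# Surprise: both names point at the same accumulating list.
assert a is b and b == ["CS101", "CS102"]

# The immutable habit sidesteps the trap: build new values, don't mutate.
def add_course_pure(course, roster=()):
    return roster + (course,)

x = add_course_pure("CS101")
y = add_course_pure("CS102")
assert x == ("CS101",) and y == ("CS102",)
```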
1/ CW: Holocaust. Since most of you will never visit Erfurt, a short thread on how engineering prowess devoid of humanity can lead to the most evil kinds of technological progress. A warning for all techies. I'll start in the next tweet for those for whom this is too much. ⤵
Proposal for a new kind of test-of-time award: conferences look at the papers they REJECTED N years ago, find the most impactful, and give it an award. We have the Grammys, Emmys, Tonys…so this one we should call the Oopsies.
Just got a paper rejected from SIGCSE for not understanding the literature on program-by-design as personified by /How to Design Programs/ by Felleisen, Findler, Flatt, and Krishnamurthi. (-:
[Fun supplement: Reviewer almost certainly outed themselves, in the process.]
Jeff Siskind has sent me email with the full account of the development of STALIN and it's every bit as epic as I remember. I would like you to contemplate the absolute beast-mode coding that we are talking about here. I do not joke when I say this should be in a computer museum.
I believe Jeffrey Siskind wrote much of the STALIN compiler – one of the most impressive of all time – largely on the moral equivalent of a calculator (a ~3-line LCD), I think while riding the Toronto metro. Absolute God Mode programming.
Maybe
@BAPearlmutter
can confirm.
Was curious to see how this would play out. Sure enough:
Karikó: "Ten years ago I was kicked out from Penn and was forced to retire".
Penn's home page: "Katalin Karikó and Drew Weissman, Penn's historic mRNA vaccine research team, win 2023 Nobel Prize in Medicine"
"Ten years ago I was kicked out and forced to retire."
Our new medicine laureate Katalin Karikó (
@kkariko
) told us how much it means to be awarded the Nobel Prize after a scientific career that has been full of challenges.
Ten years ago, Karikó was still doing all her
I'm the opposite. If there's such a high chance someone else will scoop you, you're working on the wrong problem. Someone else will solve it anyway; move on to something that not enough people are thinking about. Paranoia is unhealthy; if it's necessary, switch problems. Solved.
As I say to my students... be paranoid: someone is working on your idea right now. (what opportune timing since most of the ACL flag plants were just posted moments ago on arXiv)
My best guess: Rust got people thinking, I want a lang
- with useful types (no Python/C)
- not neutered (no Go)
- has regular state (no Haskell)
- isn't legacy-warped (no Scala)
- doesn't cause pain when you don't need it (no Rust)
and that leaves…
Is that it?
@sabine_s_
I'm sort of baffled by the sudden explosion of OCaml in my TL, but given that I've been yelling at people online to learn Standard ML since, oh, 1995, I'm down for it.
That said, OCaml sort of is the Rome – more modern, more decadent – to SML's Greece…
The way computer science students always obsess about immediate, low-level, peephole-level operations for "efficiency" and completely disregard, e.g., data representations, which have vastly bigger impact, is a great illustration of the availability heuristic/bias.
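A hypothetical illustration of the point: switching the data representation changes the asymptotics, which no amount of peephole-level fiddling can match.

```python
# Membership in a list is O(n): every query scans up to all elements.
# Membership in a set is O(1) expected: one hash lookup.
items_list = list(range(100_000))
items_set = set(items_list)

# Same answer either way...
assert (99_999 in items_list) is True
assert (99_999 in items_set) is True

# ...but the list version scans ~100k elements per query. Micro-optimizing
# that scan tweaks a constant factor; changing the representation removes
# the scan entirely.
```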
My FAQ on getting a computer science PhD in the USA. Answers to over 35 questions.
Happy to hear feedback from faculty who disagree with some of these answers.
Give me a fucking break, Georgia Tech. I'm too tired and busy, with a million other things to get done before tomorrow, to figure out the difference between 1% and 2% (precision!) across *13* areas. If you don't take my student, be assured it'll be your loss. As my letter says. Read it.
@MarkMoyou
@linylinx
"8 papers" is not a "specific skill" β it's not a skill at all. Actual skills (packages, languages, mathematics, concepts, etc.) are. This has clear indications of having been written to obtain a particular pre-selected candidate, given the very specific number.
@paulg
11yo's teacher asked students to go ask parents to change allowance: 1c on day 1, doubling every day, reset end of month. Apparently several parents went for it. (We gave ours a strong glare. She admitted she didn't think we'd fall for it.)
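For anyone reaching for a calculator: a quick check of why the glare was warranted (assuming a 30-day month).

```python
# 1 cent on day 1, doubling every day: day n pays 2**(n-1) cents.
total_cents = sum(2**day for day in range(30))  # 1 + 2 + 4 + ... + 2**29

# Geometric series: the month's total is 2**30 - 1 cents.
assert total_cents == 2**30 - 1
print(total_cents / 100)  # → 10737418.23 (about $10.7 million)
```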
Very sad to announce that Eugene Charniak, a statistical NLP pioneer ("Charniak parsing"), inaugural class ACL fellow, etc., passed away this morning.
Here's an article written at his retirement, looking back on his career.
I'm devastated to hear that William Cook passed away on Wed. Much of 1990s OOP was defined by his seminal papers. When he returned after a decade in industry (AppleScript!) I invited him to
@BrownCSDept
(where he got his PhD from Peter Wegner) and we became friends. Tragic.
I read that people were calling the Obsidian note-taking app a "cult", and it seems like every two years there's another new hot note-taking app. My conjecture: the hard part is thinking clearly, and people keep hoping the next one will help with that. (Spoiler: likely won't.)
I'm excited about the new Verse programming language from epic games, but I'm also terrified about all the awful paper titles it's going to spawn: Verse is Better, For Better or Verse, etc.
1/ "What programming language should I teach?" is the least productive question to ask in computing. There's a good reason: it's the wrong question to ask. The reason language wars feel pointless is that they're a symptom of this problem. Here's why: β΅
With others, I've been building programming environments for students for a long time, & I've learned several design principles. I decided to write them up, including an unpublished one that has driven our past few years of work. "What is a Pedagogic IDE?"
Why do I freakin' love my job? I left open how to do subtraction with Church numerals (having worked up through multiplication), and a student FIGURED IT OUT HERSELF. (Church himself was stumped!) She's applying to some of your PhD programs. (-:
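For the curious, the standard route she presumably rediscovered is the pair trick usually credited to Kleene; here's a sketch using Python lambdas as a stand-in for the lambda calculus (names like `to_int` are just for readability, and Python tuples stand in for Church pairs):

```python
# Church numerals: n = "apply f, n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
to_int = lambda n: n(lambda k: k + 1)(0)   # decode for testing

# The pair trick: walk up from (0, 0), always remembering the PREVIOUS value.
# After n steps the pair is (n-1, n); take the first component.
pred = lambda n: n(lambda p: (p[1], succ(p[1])))((zero, zero))[0]

# Subtraction: apply pred, n times.
sub = lambda m, n: n(pred)(m)

three = succ(succ(succ(zero)))
five = succ(succ(three))
assert to_int(sub(five, three)) == 2
assert to_int(sub(three, five)) == 0   # truncated: bottoms out at zero
```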
I just love watching programmers discover sensible languages.
Sure, we've had this for literally decades, but let's welcome such people and grow the community.
Am I the only person who finds it ludicrous that GitHub Copilot is free for students but not for instructors, who need to test how it will mess up their assignments? Is this Microsoft's idea of being a good citizen and friend of academia?
Very few people know this but when designing the original
@racketlang
logo, we realized the lambda had another meaning, thought about it for a second, and decided all the more reason to go for it.
So a pride version of it is…actually especially appropriate – it's always been one.
1/ For several years we've been building Forge, a pedagogically-focused tool to teach formal methods. Here's our first write-up! As the name suggests it's a tribute to Alloy, but we're innovating on several fronts:
⤵
This talk was a labor of love for me. This paper (Felleisen's "On the Expressive Power of Programming Languages") changed my life. I was delighted to revisit it 25 years after reading it. [thread»]
@yar_vol
Yes, I've written software, thanks.
And it's ironic that you would bring up iOS apps, on which Reddit has such a strong reputation for quality…
Me, complaining that my laptop is slow, my keyboard is crap, and the screen could perhaps be a bit bigger:
vs.
Jeff Siskind, writing STALIN, the world's greatest-ever optimizing compiler:
Begging AI courses to step back from their giant linear algebra number crunching machines and ask students to once again implement and play with trivial and puny Eliza.
Just had a quite emotional, personal conversation w/ ChatGPT in voice mode, talking about stress, work-life balance. Interestingly I felt heard & warm. Never tried therapy before but this is probably it? Try it especially if you usually just use it as a productivity tool.
Property-based testing is an important upcoming topic in software reliability, but it gets no attention in computing education. We've been trying to fix that. This blog post summarizes our work and points to a recent paper with lots more detail. »
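A minimal hand-rolled sketch of the idea, for readers who haven't met it (real tools like Hypothesis or QuickCheck add smarter generation and shrinking; everything here is a simplified stand-in):

```python
import random

def my_sort(xs):              # the code under test
    return sorted(xs)

def check_property(prop, gen, trials=200):
    """Generate random inputs and check that the property holds on each."""
    for _ in range(trials):
        xs = gen()
        assert prop(xs), f"property failed on {xs!r}"

# Generator: random-length lists of small ints.
gen = lambda: [random.randint(-50, 50) for _ in range(random.randint(0, 20))]

# We state PROPERTIES, not hand-picked input/output examples:
check_property(lambda xs: my_sort(my_sort(xs)) == my_sort(xs), gen)  # idempotent
check_property(lambda xs: all(a <= b for a, b in
                              zip(my_sort(xs), my_sort(xs)[1:])), gen)  # ordered
check_property(lambda xs: sorted(my_sort(xs)) == sorted(xs), gen)    # permutation
```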
Kid tells me about an optional math homework she chose not to do. I ask why not.
K: It's all about order of operations!!!
M: So?!?
K *rolls eyes*: Just use parentheses and you're done.
[exits stage right]
#parenting
People who complain "CS students today don't even take X!": do you know how much CS has grown? I pulled up the
@BrownCSDept
course list and marked in yellow all courses that did NOT exist ~20y ago (when many grumblers graduated). YMMV, but there are way, WAY more things to learn.
10/ The next generation of computing problems will not be about writing 80s style 5-line for-loops. It'll be about properties, specification, reasoning, verification, prompt eng, synthesis, etc. How will we get there?
And no, I will not be taking questions. (-:
I love my job. At 9am I taught Dijkstra's algo, at 11 I'm doing Hindley-Milner. Two Turings in one day!
H-M is one of the most beautiful algos and yet is ~never done in algo classes. Also illustrates a profound principle: decompose HARD problem into 2 easy steps, one v reusable!
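A toy sketch of that decomposition (all names and representations here are hypothetical simplifications): phase 1 walks the program emitting type-equality constraints; phase 2 solves them with unification, the very reusable step.

```python
def walk(t, subst):
    """Chase a type variable through the substitution to its binding."""
    while isinstance(t, str) and t in subst:
        t = subst[t]
    return t

def unify(t1, t2, subst):
    """Solve the equation t1 = t2, extending the substitution."""
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if isinstance(t1, str) and t1.startswith("?"):   # "?x" = type variable
        return {**subst, t1: t2}
    if isinstance(t2, str) and t2.startswith("?"):
        return {**subst, t2: t1}
    if isinstance(t1, tuple) and isinstance(t2, tuple) and len(t1) == len(t2):
        for a, b in zip(t1, t2):                     # unify constructors piecewise
            subst = unify(a, b, subst)
        return subst
    raise TypeError(f"cannot unify {t1} with {t2}")

# Constraints a generator might emit for (lambda (x) (+ x 1)):
#   ?x = num            (x is used as an argument to +)
#   ?f = (?x -> num)    (the lambda's type)
s = {}
s = unify("?x", "num", s)
s = unify("?f", ("->", "?x", "num"), s)
assert walk("?x", s) == "num"
assert walk(s["?f"], s) == ("->", "?x", "num")
```

The payoff the tweet alludes to: constraint generation is a simple syntax-directed walk, and unification knows nothing about the source language, so it can be reused wholesale.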
Serious question: Why do you think recursion is hard?
I feel much of the recursion literature in CS Ed is (IMO) pretty crap. I'm interested in hearing what you think makes it difficult to learn or use. (If you never found it hard, I'm not interested in your ramblings. <-;)
What a country: It launches people to the stars at the same time that it shoves necks to the ground. One is awe-inspiring, the other beyond disgusting. In that sense 2020 is just like 1968. (The Apollo successes happened against a backdrop of King's assassination, the DNC Convention, ….)
The Republican Party's 2020 platform is exactly the same as its 2016 platform. That means it condemns the "current" president – who in 2016 was, of course, Barack Obama – and calls the White House a risk to "the survival of the internet."
1/ Once you look past syntax and "paradigms", many programming languages (Java, Python, Racket, …) share a common semantic core. But students seem to understand it very poorly, which leads to endless confusion (as often seen on here). What to do? ⤵
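One hypothetical illustration of that shared core: closures capture variables (environments), not snapshots of values – true in Python, JavaScript, Java lambdas, and friends, and a perennial source of the confusion in question.

```python
# All three lambdas close over the SAME variable i, whose final value is 2.
fns = [lambda: i for i in range(3)]
assert [f() for f in fns] == [2, 2, 2]

# Forcing capture-by-value (here via a default argument) gives what
# students usually expected:
fns = [lambda i=i: i for i in range(3)]
assert [f() for f in fns] == [0, 1, 2]
```

The same program transliterated into other mainstream languages behaves identically: the syntax differs, the semantic core doesn't.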
1/ Several people have asked me to summarize my exploration of the low-code/no-code space. Here's what I learned. Note that this is VERY temporal: what's true today may not be in one month (especially with so much VC money sloshing about). Also, not tagging any companies. 🧵
A blog post summarizing our year+-long work on teaching Rust ownership, spearheaded by
@tonofcrates
and Gavin Gray! Though there are two visualizations that are nice and effective, the post also points to the essential insights (which aren't those).
Don't patronize us,
@TheOfficialACM
. We're the "talent" that produces what you sell (and some of us are also the taxpayers who fund the talent). You had a choice and decided to come out on the wrong, wrong side of history. WE ARE STAKEHOLDERS IN EVERY WAY - where's OUR feedback?
@tdietterich
@esa
Just to clarify, ACM itself has a long and growing list of Open Access initiatives in support of our members and authors. But we support other publishers in protesting a regulatory change that doesn't involve stakeholder feedback.
Life in a college town (embedded in a medium-size city).
Based on my priors, I'd say there's a high probability this is a faculty member's house. (Though we do also have professional biostatisticians and the like lurking about.)
Providence,
#RhodeIsland
Hugh Lauer's 1972 PhD, Correctness in Operating Systems, is a classic (and spawned great work). Until now it was not online. Hugh has spent two weeks scanning it. Make sure your students know about it – concurrent verification wasn't invented last decade!
This had to be written sooner or later. I keep thinking I must have written this but I don't seem to have. I'll refine it over time, but here's a draft: How Not to Teach Recursion.
The people going on about alcohol at conferences: first spend an extra two bucks so the vegetarian food isn't "pasta primavera" every day and actually has nutrition, flavor, and a hint of creativity. Ffs, it's the absolute worst thing about conferences.
Some folks are arguing that conferences *must* include alcohol because otherwise people will just go outside and get a drink. My model of conferences attendees is that they will go where their friends are, not where the alcohol is.
25/ I have nothing to add to the above. I'm just a tech nerd who worries about tech nerds getting too excited about tech and forgetting about humans. But I have a bit of an addendum about this whole display. ⤵
1/ A remarkable book that uses the singular titular document to unlock a much bigger, mostly overlooked story: the US's role in the birth of Bangladesh. Bass argues that Nixon and Kissinger's treatment of Bangladesh has been forgotten by history but shouldn't be. ⤵
#BookReview
Does anyone have serious, un-ironic, non-mocking ideas for what a coronavirus hackathon could produce of value? (Beyond the obvious "use this app to report a case in your area".) I'm coming up pretty blank; impress me.
Here's a productivity technique I've been using for a month, and it has worked really well. It reduces distraction, focuses effort, and sets targets. It assumes most of your tasks accumulate as emails. I'll explain it in a short thread. ⤵
It's worth reading
@ArjunGuha
's "acceptance speech" of our Flapjax Test-of-Time Award, because it's basically half of PLMW in a few paragraphs. Grad students: take courage. Your instincts are right.
Delighted to release the third edition of Programming Languages: Application and Interpretation (PLAI). A complete rewrite based on many years of research and experience. Because that research is still ramping up, this is just the start of many changes.
The NYT piece about
@timnitGebru
confirms my hypothesis that her paper was going to embarrass Google (and maybe cause more gov't action). But it also exposes the oxymoron in "industrial research" and shows that there's another Desk Drawer Effect in science that we don't discuss.
Honestly, most excited
@sigplan
recognized
@KathiFisler
. I get ample notice, but she's worked so well, so hard, on so much, yet gets overshadowed.
So many others also earned this:
@racketlang
collabs,
@Bootstrapworld
co's,
@PyretLang
co's, former students. I'm just SO darn lucky.
For the past two years our kid has been using "FatNums", a slightly different numeral representation we came up with, for all her school arithmetic problems. She thought it'd be nice to write it up and share it. Here goes.
#parenting
Every sufficiently complex API *is* a language, it's a choice of whether to expose it as one or not.
My rule of thumb is, any time an API has a "beginX" and "endX" pair, it's basically a language that has chosen to allow { and } to not necessarily match up, with consequences.
Why PL – any library with a sufficiently complex API becomes a language.
I picture software systems on two sliding scales of internal complexity and external (interface) complexity, where as you move along the second scale, you go from library-ish, to DSL-ish, to language-ish.
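A sketch of the beginX/endX point in Python (the drawing-ish `Group` API is hypothetical): paired calls are brackets that the host language won't balance for you, whereas exposing the nesting as language structure restores matching.

```python
from contextlib import contextmanager

class Group:
    """Toy API with begin/end pairs -- i.e., unmatched-by-default brackets."""
    def __init__(self):
        self.depth = 0
    def begin_group(self):          # an opening "{" the compiler can't check
        self.depth += 1
    def end_group(self):
        assert self.depth > 0, "end_group without begin_group"
        self.depth -= 1

g = Group()
g.begin_group()
g.end_group()                       # nothing forced us to write this line
assert g.depth == 0

# Turning the pair into block structure makes the brackets match by construction:
@contextmanager
def group(g):
    g.begin_group()
    try:
        yield
    finally:
        g.end_group()               # the `with` block guarantees the close

with group(g):
    pass                            # even an exception here still closes the group
assert g.depth == 0
```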
For people who don't know the Butler Lampson story about the "weight of software", read his Draper Award talk (worth reading anyway; Butler changed my life). Here's the whole thing, but here's the relevant extract.
To people who have kindly inquired about my recovery: x-rays are good, permitted to start putting some weight on the formerly-broken leg.
1. I have forgotten how to walk and am re-learning.
2. Brain has spent 2.5 months keeping foot far from ground, and is now VERY confused.
OMG you people. There's a Google Sheets plugin that converts tables in Sheets into LaTeX tables. It's not perfect but it's still a game-changer: I'll never hand-create a table again. (The output is readable, so it's easy to clean up.) Thank you caenrigen.
I'm just grinning madly at the functional languages – especially the grandparents, OCaml, Haskell, and
@racketlang
– just smashing the heck out of Python and Co.
But please, by all means do tell me about how inefficient functional programming is.
This table says it all. One of the reasons we're going big on Rust is that it delivers incredible economies without trading off safety. On Cloud, sustainability is a big motivator, and on devices battery lifetime is the biggest differentiator.
2/ Rich and robust programming requires a strong understanding of data models and invariants. Python is weak at expressing either of those. You don't notice it until you miss it. ⤵
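A hedged illustration of the claim (the `Fraction` example is hypothetical): Python's type annotations describe a data model but don't enforce it, so invariants must be policed by hand.

```python
from dataclasses import dataclass

@dataclass
class Fraction:
    num: int
    den: int          # invariant we'd LIKE: den != 0

# The annotations say int; the runtime doesn't care.
f = Fraction("3", 0)
assert f.num == "3"   # nothing checked the type, nothing checked den != 0

# Enforcing the data model means writing the checks yourself:
@dataclass
class CheckedFraction:
    num: int
    den: int
    def __post_init__(self):
        assert isinstance(self.num, int) and isinstance(self.den, int)
        assert self.den != 0, "denominator must be nonzero"

ok = CheckedFraction(3, 4)
assert (ok.num, ok.den) == (3, 4)
```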
Holy crap: an Ethereum contract programming language called … Pyramid Scheme. (And yes, of course, Pyramid Scheme is built using Racket. The secret of the long con is out.)
Ah, it's open season on long South Indian last names in *checks watch* 2023. Who needs Ann Coulter when we can get this from a supposedly reputed journalist.
Mom, a very skilled doctor, hearing of my COVID, asks 3 laser-focused medical questions:
- Have you eaten?
- Did you eat normally?
- Did you finish food?
Dad mumbles in background; Mom cuts him off: "No, no, he'll be fine, he's eating".
So it sounds like I'll be fine, y'all.