PicoBlog

Watching Everything Everywhere All at Once, I had a feeling that I haven’t had since I first watched Bo Burnham: Inside last year. The feeling that I was seeing something new. Not just something original, but a new thing. A new thing that seemed to be capturing something I had felt but hadn’t yet seen conveyed in art. There is something many of us are experiencing, but few are talking about.
Grooming is NOT what the transphobes and TERFs (or FARTS) want you to think it is. In the dictionary, grooming is the act or process of preparing someone to fill a position or role, or to undertake an activity. You can "groom" a cashier to take over a management position in a store, or groom a dog for a dog show. These definitions are nothing anyone would raise an eyebrow over. However, the OTHER kind of grooming (where people with bad intentions manipulate and coerce others for selfish purposes) has been in the news a LOT lately.
A couple of weeks ago, as I wandered the internet looking for art opportunities, I stumbled upon a call for art for a new hospital. It grabbed my attention, as hospitals generally have both money and enormous expanses of wall, things painters often lack. As I read on, though, my enthusiasm cooled. The curators had a long list of precise stipulations about the art they were looking for, and it didn’t include watercolor viruses:
Housekeeping: For Moment of Zen, we spoke to Marc Andreessen about AI, religion, longevity, the NPC meme, and more. For In the Arena, we spoke to Martin Shkreli about his rise and fall and his comeback arc. For Media Empires, we spoke to Steph Smith about the creator economy. For Econ 102, Noah Smith outlined libertarianism’s rise and fall and the need for it to rise again. In our post on political realignments, we talked about the flippening between the Democrats and the Republicans.
For those paying attention, it is obvious that the cultural vanguard has long since moved on from the played-out tropes and predictable strategies of “postmodernism.” That story is old, and there is, by now, over a decade’s worth of academic literature devoted not just to postmodernism’s decline but to what has arisen to succeed it since the early 2000s. While the legacy of postmodernism will of course live on and continue to permeate society, it is hardly the spearpoint of cultural innovation anymore.
I’ve spent a lot of my career working in “Operations” in tech companies. This has meant spending a lot of time explaining to people what exactly it is I do. Unlike Sales or Product, there isn’t an immediately obvious, tangible outcome that Operations is associated with. Operations is one of those nebulous terms that’s everywhere, and yet there is no single agreed definition of it. In a selfish attempt to reduce the time I spend repeating myself, I’ve decided to write this overview to help unpack the mystery of Ops.
Hey everyone, I’m not a developer, but the open-source movement in LLMs is gaining momentum in the spring of 2023 — from Meta AI’s LLaMA to UC Berkeley’s 7B OpenLLaMA model, an open-source alternative to Meta’s LLaMA language model. The model has been trained on the RedPajama dataset with 200 billion tokens, and its weights are available in PyTorch and JAX. With the latest release, all non-commercial models stemming from LLaMA can now be re-trained under a permissive license.
According to stereotype, analytic philosophers love nothing more than analyzing concepts, filling the ellipsis in x is F if and only if … with conditions held to be implicit in the meaning of a word. It’s an anachronistic vision, both because “analytic truth” plays a minimal role in contemporary philosophy—there’s more interest in “real definition,” the metaphysical project of explaining what it is to be F—and because philosophers are willing to treat concepts as primitive: undefined but well-understood.
In this article, we take a step back and take a closer look at what SwiGLU is, how to code it, and why it works. In deep learning, we use neural networks to learn behaviours and patterns in our data. “Neural network” sounds like a complicated term, but a network is really just a collection of parameters that we multiply with our input to produce a result; we keep tweaking those parameters until they learn something useful.
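Before the full walkthrough, here is a minimal sketch of SwiGLU in plain Python (no framework), assuming the standard formulation SwiGLU(x) = Swish(xW) ⊙ (xV), where ⊙ is the elementwise product and W, V are two learned weight matrices; the function and variable names here are illustrative, not taken from any particular library.

```python
import math

def sigmoid(x):
    """Logistic sigmoid: 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def swish(x, beta=1.0):
    """Swish activation: x * sigmoid(beta * x). With beta = 1 this is SiLU."""
    return x * sigmoid(beta * x)

def matvec(M, v):
    """Row-vector times matrix: returns v @ M, where M has len(v) rows."""
    return [sum(v[i] * M[i][j] for i in range(len(v))) for j in range(len(M[0]))]

def swiglu(x, W, V):
    """SwiGLU(x; W, V) = Swish(x @ W) * (x @ V), an elementwise 'gated' product."""
    gate = [swish(a) for a in matvec(W, x)]    # the gating branch
    value = [b for b in matvec(V, x)]          # the value branch
    return [g * v for g, v in zip(gate, value)]

# Tiny example: 1-d input, identity weights, so the output reduces to swish(1.0).
out = swiglu([1.0], [[1.0]], [[1.0]])
```

In a real model you would express the same thing with two linear layers and a built-in SiLU (e.g. PyTorch’s `F.silu(lin1(x)) * lin2(x)`), with both weight matrices learned during training.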