<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Kojin Glick</title>
    <description>Blog articles about technology and the state of the world by Kojin Glick.</description>
    <link>https://kojinglick.com</link>
    <lastBuildDate>Mon, 30 Jun 2025 16:04:18 -0700</lastBuildDate>
    <pubDate>Mon, 30 Jun 2025 16:04:18 -0700</pubDate>
    <language>en</language>
    <item>
      <title>The Hollowcene</title>
      <link>https://kojinglick.com/blog/hollowcene</link>
      <description>&lt;p&gt;A peculiar hollowness has settled over our collective experience, a phenomenon that I would like to call &quot;the Hollowcene.&quot; The term emerges as a necessary addition to our epochal lexicon, not merely as a clever neologism but as recognition of a profound shift in the texture of human existence, one that rivals the geological and biological transitions that have punctuated Earth&#39;s history.&lt;/p&gt;
&lt;p&gt;The evidence surrounds us, if we care to notice it. Consider the eerie familiarity of scrolling through online spaces that seem populated yet oddly vacant: digital town squares where conversation flows continuously but often without the friction of genuine human exchange. The phenomenon known as &quot;Dead Internet Theory&quot; posits that vast swaths of online content are now generated by algorithms rather than people, creating the simulacrum of human culture rather than culture itself. Similarly, the internet folklore of &quot;the Backrooms&quot; captures our collective unease: endless, identical yellow corridors of corporate carpet and buzzing fluorescent lights, a purgatory of the mundane where one might wander forever without encountering another soul. The scenes feel familiar but lost, a bowling alley from a late-90s or early-00s birthday party, yellowish wallpaper on a dentist&#39;s walls.&lt;/p&gt;
&lt;p&gt;What distinguishes the Hollowcene from mere cultural malaise is its ontological dimension. Just as the Anthropocene marks humans becoming a geological force, reshaping the physical and biological world, the Hollowcene represents a fundamental alteration in how culture itself is constituted. The late cultural theorist Mark Fisher identified this condition as &quot;hauntology,&quot; a state in which we find ourselves haunted by futures that never arrived, endlessly recycling nostalgic visions of pasts that never existed. We have, in effect, become unstuck in time, unable to imagine genuine futures while constantly revisiting simulations of memory.&lt;/p&gt;
&lt;p&gt;The significance of naming this condition extends beyond academic exercise. The hollowing out of experience represents a rupture as consequential as the advent of agriculture or the Industrial Revolution: a restructuring of consciousness itself. When algorithmic entities can mimic human creativity, when digital spaces feel simultaneously crowded and abandoned, when the uncanny valley expands to encompass entire realms of interaction, we have entered a new relationship with reality that demands recognition.&lt;/p&gt;
&lt;p&gt;Our experience has become increasingly mediated, creating a persistent layer of removal from direct engagement. This mediation is not merely technological but ontological, a transformation comparable to how writing once reshaped human cognition. The pervasive sense of unease, the feeling of navigating through empty replicas of meaning, constitutes a shared psychological condition that defines our era. We find ourselves like explorers in the Backrooms, surrounded by the familiar made strange, sensing that something fundamental has been extracted from beneath the surface of experience. In the Backrooms, the comforts of the past linger past their welcome. They&#39;re not supposed to be here, and yet their forceful assertion in this non-place makes us question the soothing sensation of familiarity.&lt;/p&gt;
&lt;p&gt;In the Hollowcene, individuals have been given everything they want, so long as they accept a wall between themselves and others. As long as we let Meta, or X, or Snap mediate our relationships, our dopamine systems will never go unused. But after a while, we started to mistrust the strange sounds from beyond the wall.&lt;/p&gt;
&lt;p&gt;Indeed, the parallel to the Greek underworld is uncannily apt. We have become shades wandering the asphodel meadows, each of us insubstantial, each passing through one another without true contact. Like the souls in Hades who drink from Lethe to forget their former vitality, we sip continuously from streams of content that provide momentary salience while eroding memory. The mythic river Styx that separates the living from the dead finds its contemporary analogue in the interfaces that promise connection while ensuring separation, offering simulacra of intimacy that leave us perpetually unfulfilled. Plato&#39;s allegory of the cave finds renewed relevance as we mistake shadows on walls for reality, arguing fiercely about projections while remaining chained in place. The Hollowcene has transformed us into Tantalus figures, forever reaching for human connection that remains just beyond grasp, or perhaps into Sisyphean strivers, endlessly pushing the boulder of our curated personas uphill only to watch them roll back down with each refresh. What the ancients understood as divine punishment, we have somehow reimagined as technological progress, building ever more elaborate systems to maintain our isolation while convincing ourselves that we have never been more connected. The phantoms and wraiths of Greek mythology now take digital form, as we increasingly interact with algorithmic approximations of humanity, wondering why the exchanges leave us feeling so bereft of warmth.&lt;/p&gt;
&lt;p&gt;The Hollowcene, then, represents not just cultural critique but an attempt to understand a critical juncture in human civilization, one that will leave as permanent a mark as any geological shift, even if its traces are not visible in strata of rock but in the architecture of consciousness itself. To define it is to begin the necessary work of understanding how to exist meaningfully within it, or perhaps how to find our way out of those endless, empty corridors.&lt;/p&gt;
</description>
      <pubDate>Fri, 16 May 2025 19:14:34 -0700</pubDate>
    </item>
    <item>
      <title>The Intentional Toolmaker</title>
      <link>https://kojinglick.com/blog/intentional-toolmaker</link>
      <description>&lt;p&gt;In the grand tradition of design philosophies, from the Unix credo to the Bauhaus ethos, there emerges a new paradigm for crafting tools that harmonize human and machine intelligence. We propose Intentional Tooling as a guiding principle for innovation – an approach that seeks to augment, rather than diminish, our capacity for intentional thought.&lt;/p&gt;
&lt;p&gt;The tool-user dyad is not a zero-sum game, where one&#39;s gain is the other&#39;s loss. Rather, it&#39;s a symbiotic dance, where each partner informs and elevates the other. The pencil, that humble harbinger of creativity, exemplifies this synergy. It’s often said that notes are better remembered when written down by hand, rather than typed or stored in our constantly-overwritten mental RAM. Through our physical intervention on a medium that maps shapes to concepts, the pencil’s simplicity belies the complex mental calculations it enables – a testament to the transcendent power of human imagination.&lt;/p&gt;
&lt;p&gt;But as we&#39;ve progressed from physical to digital tools, our design philosophies have often prioritized expediency over intentionality. We&#39;ve created instruments that amplify our baser tendencies, rather than nurturing our highest cognitive faculties. The social media feedback loop, for instance, has reduced interaction to a shallow distillate of clicks and likes – an alembicated reduction that belies the depths of cognitive transformation it&#39;s wrought upon us.&lt;/p&gt;
&lt;p&gt;Here, we must acknowledge our own fallibility in mediating humanity&#39;s worst instincts. Whether it is our most evil thoughts or our captured public institutions, we have failed to effectively align humans, let alone our generative creations. We&#39;re not omniscient judges of machine intelligence; rather, we&#39;re flawed collaborators, striving to create tools that elevate our shared potential.&lt;/p&gt;
&lt;p&gt;Intentional Tooling seeks to reverse this trend by designing tools that promote, rather than degrade, intentionality. This means crafting instruments that:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Amplify Cognitive Frictions&lt;/strong&gt;: Tools should introduce cognitive dissonance, prompting users to reconcile conflicting ideas and forge novel connections.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Encourage Embodied Cognition&lt;/strong&gt;: By incorporating haptic feedback, spatial reasoning, and sensorimotor integration, tools can foster a more embodied understanding of complex concepts.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Nurture Metacognition&lt;/strong&gt;: Tools should facilitate self-awareness, enabling users to recognize their own thought processes and adjust them in real-time.&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;By embracing Intentional Tooling, we can co-create a future where human and machine intelligence converge in a symbiotic pas de deux. Our tools will no longer be mere extensions of ourselves but rather collaborative partners that augment our intentionality. In this way, we&#39;ll ensure that the next revolution in tool-mediated evolution is one that elevates, rather than diminishes, our shared humanity.&lt;/p&gt;
&lt;p&gt;The Intentional Toolmaker&#39;s manifesto reads: &quot;Design for human-machine intelligence advancement, not degradation. Craft tools that amplify cognitive frictions, encourage embodied cognition, and nurture metacognition. Together, let us forge a future where intentionality flourishes – for in this dance between tool and wielder lies the very future itself, awaiting our creative touch.&quot;&lt;/p&gt;
</description>
      <pubDate>Mon, 17 Feb 2025 22:52:55 -0800</pubDate>
    </item>
    <item>
      <title>DeepSeek and Monopoly Economics</title>
      <link>https://kojinglick.com/blog/deepseek-and-monopoly-economics</link>
      <description>&lt;p&gt;When a model makes headlines like this, the media’s “Breaking News” tunnel vision often obscures what’s important. Between the Mixture of Experts architecture, the Chain of Thought capabilities possessed by the R1 model, there’s a lot to discuss about the model in isolation. But DeepSeek hasn’t done anything particularly novel by itself. DeepSeek’s challenge to ChatGPT is fundamentally about the future of the AI economy, and whether the walls built by the partnership between OpenAI and the new administration are high enough to keep challengers, Chinese or otherwise, out.&lt;/p&gt;
&lt;h3&gt;What DeepSeek &lt;em&gt;does&lt;/em&gt;&lt;/h3&gt;
&lt;p&gt;DeepSeek, as a family of models, uses an architecture called Mixture of Experts (MoE), a decades-old idea &lt;a href=&quot;https://mistral.ai/news/mixtral-of-experts/&quot;&gt;recently popularized for open models by Mistral in France&lt;/a&gt;. In a Mixture of Experts model, parts of the neural network are specialized for certain tasks, and a gating network, usually called a router, directs each request to the appropriate members of its entourage of “Expert” sub-networks. Generating new text is also far more efficient, because only a few experts, rather than the entire network, are activated for each token.&lt;/p&gt;
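&lt;p&gt;As a rough illustration of the routing idea, here is a toy NumPy sketch of top-k expert routing. This is not DeepSeek&#39;s actual implementation; every size, name, and weight here is invented for demonstration.&lt;/p&gt;

```python
import numpy as np

rng = np.random.default_rng(0)

N_EXPERTS = 8   # number of expert sub-networks
TOP_K = 2       # experts activated per token
D_MODEL = 16    # toy hidden size

# Each "expert" is a tiny linear layer; the router is a linear map
# from the token representation to one score per expert.
experts = [rng.standard_normal((D_MODEL, D_MODEL)) for _ in range(N_EXPERTS)]
router = rng.standard_normal((D_MODEL, N_EXPERTS))

def moe_forward(token):
    scores = token @ router                # one routing score per expert
    top = np.argsort(scores)[-TOP_K:]      # indices of the TOP_K best-scoring experts
    weights = np.exp(scores[top])
    weights = weights / weights.sum()      # softmax over the chosen experts only
    # Only TOP_K of the N_EXPERTS sub-networks do any work for this token.
    return sum(w * (token @ experts[i]) for w, i in zip(weights, top))

out = moe_forward(rng.standard_normal(D_MODEL))
print(out.shape)  # (16,)
```

The efficiency claim above falls out of the structure: the forward pass touches only `TOP_K` expert matrices per token, so most of the network&#39;s parameters sit idle during any single generation step.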
&lt;h3&gt;What DeepSeek &lt;em&gt;did&lt;/em&gt;&lt;/h3&gt;
&lt;p&gt;DeepSeek scares Silicon Valley, not only because of what it does, but because of how it got there. DeepSeek trained a 671B-parameter chat model, comparable to OpenAI’s gpt-4o, and a 70B distilled reasoning model, comparable to OpenAI’s o1. If we assume that the unit cost of a GPU-hour was around $2, &lt;a href=&quot;https://arxiv.org/pdf/2412.19437v1&quot;&gt;the price tag for training them was $5.576 million&lt;/a&gt;. Though we don’t know exactly how much OpenAI spends on their models, we do know that in the year that OpenAI released gpt-4o, they reportedly spent around $5 billion.&lt;/p&gt;
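&lt;p&gt;The arithmetic behind that price tag is simple. The linked V3 paper reports roughly 2.788 million H800 GPU-hours of training, and the $2-per-GPU-hour figure is the paper&#39;s own rental-price assumption:&lt;/p&gt;

```python
# Back-of-the-envelope check of the training-cost figure.
gpu_hours = 2.788e6          # total GPU-hours reported in the DeepSeek-V3 paper
cost_per_gpu_hour = 2.00     # assumed rental price per GPU-hour, in USD
total = gpu_hours * cost_per_gpu_hour
print(f"${total / 1e6:.3f} million")  # $5.576 million
```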
&lt;p&gt;These discounts are passed on to AI entrepreneurs as an almost 10x reduction in the price of using the models for enterprise purposes. It is cheaper, by an order of magnitude, to build your AI product idea on DeepSeek than on ChatGPT. Some of this cost-reduction can be explained by the efficiency of training many domain experts rather than one massive oracle, but the shock in Silicon Valley and Wall Street reveals the emperor’s lack of clothing.&lt;/p&gt;
&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th align=&quot;center&quot;&gt;Model&lt;/th&gt;
&lt;th align=&quot;center&quot;&gt;Input Tokens* (per million)&lt;/th&gt;
&lt;th align=&quot;center&quot;&gt;Output Tokens (per million)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;&lt;tr&gt;
&lt;td align=&quot;center&quot;&gt;&lt;a href=&quot;https://api-docs.deepseek.com/quick_start/pricing/&quot;&gt;deepseek-chat&lt;/a&gt;&lt;/td&gt;
&lt;td align=&quot;center&quot;&gt;$0.07 - $0.27**&lt;/td&gt;
&lt;td align=&quot;center&quot;&gt;$1.10**&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td align=&quot;center&quot;&gt;&lt;a href=&quot;https://api-docs.deepseek.com/quick_start/pricing/&quot;&gt;deepseek-reasoner&lt;/a&gt;&lt;/td&gt;
&lt;td align=&quot;center&quot;&gt;$0.14 - $0.55&lt;/td&gt;
&lt;td align=&quot;center&quot;&gt;$2.19&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td align=&quot;center&quot;&gt;&lt;a href=&quot;https://openai.com/api/pricing/&quot;&gt;gpt-4o&lt;/a&gt;&lt;/td&gt;
&lt;td align=&quot;center&quot;&gt;$1.25 - $2.50&lt;/td&gt;
&lt;td align=&quot;center&quot;&gt;$10.00&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td align=&quot;center&quot;&gt;&lt;a href=&quot;https://openai.com/api/pricing/&quot;&gt;o1&lt;/a&gt;&lt;/td&gt;
&lt;td align=&quot;center&quot;&gt;$7.50 - $15.00&lt;/td&gt;
&lt;td align=&quot;center&quot;&gt;$60.00&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;&lt;/table&gt;
&lt;p&gt;* The two prices reflect prompt caching: the lower price applies to input tokens already stored in the provider’s cache (a cache hit), and the higher price to tokens that must be processed fresh (a cache miss).&lt;br&gt;** Expected cost after the promotional pricing that ends Feb 8.&lt;/p&gt;
&lt;h3&gt;The price of monopoly is the entire market&lt;/h3&gt;
&lt;p&gt;Wall Street and Silicon Valley face an important crossroads: continue pursuing monopoly economics in AI and lose the wholesale market to China, or sacrifice their existing bottom line by nurturing a robust domestic AI economy.&lt;/p&gt;
&lt;p&gt;The companies that make up “Silicon Valley” act predictably when they search for new markets to capture. When distributed systems became the standard computation paradigm in the 2010s, Silicon Valley was able to capture the market by turning the Cloud into economies of scale: branded data-centers, billions in broadband investment, and bulk priced access to their walled gardens. After OpenAI launched ChatGPT in 2022, investors predicted that AI adoption would follow Cloud adoption.&lt;/p&gt;
&lt;p&gt;Huge losses are expected in these endeavors, since the real reward is the capture of the market. And because the typical threat to a monopolistic corporation is government antitrust enforcement, Sam Altman and OpenAI have been hard at work trying to capture regulators through &lt;a href=&quot;https://openai.com/index/announcing-the-stargate-project/&quot;&gt;the Stargate Project&lt;/a&gt;. By aligning the federal government’s pocketbook with their long-term ambition to own and operate the infrastructure of AI in the United States, the current administration and OpenAI are making a bet that the future of AI is the same thing as the future of OpenAI.&lt;/p&gt;
&lt;p&gt;If DeepSeek can operate at 10% of the cost of OpenAI, what use is there for OpenAI, and for OpenAI-branded datacenters? At the core of the Stargate Project is the assumption that AI entrepreneurs will use only OpenAI services to build the next generation of AI tooling. But that assumption cannot withstand a 90% cheaper competitor. Unless OpenAI is willing to reduce potential revenue by price matching, doubling down on the OpenAI monopoly is tantamount to handing a majority of the AI infrastructure market to China.&lt;/p&gt;
&lt;p&gt;In order to keep AI entrepreneurial dollars in the United States, OpenAI and the new administration must start investing in open infrastructure. This consists of open research and a clear, simple path to competitiveness for smaller players.&lt;/p&gt;
</description>
      <pubDate>Wed, 29 Jan 2025 10:39:58 -0800</pubDate>
    </item>
    <item>
      <title>Audience: The Currency of Technology</title>
      <link>https://kojinglick.com/blog/audience-the-currency-of-tech</link>
      <description>&lt;p&gt;The “technology” company is an unfortunate misnomer for what a tech company really is: an audience company. This is particularly true of any company involved in social media, but this trend is true at a more abstract level for all “technology” companies. Producing novel technologies is highly costly. Therefore, in the currently dominant mode of economic production today, all technologies produced are compensated by the exchange of capital for audiences. In other words, all technology companies are only in the business of “technology” as it pertains to securing enough users or customers to justify their acquisition. When a technology company interfaces with a venture capitalist firm, or a larger firm who wants to acquire them, capital is offered in proportion to the value of that technology’s audience.&lt;/p&gt;
&lt;p&gt;From the perspective of tech companies, audiences - digital communities, populations, or market segments - are only valuable insofar as they accurately predict the consumption behavior of a group of individuals. Audience-makers seek to balance the trade-offs of two important features: audience size, and audience cohesion. Audience size determines the number of individuals who are given consumption signals. Audience cohesion determines the rate at which the audience will reliably respond to the consumption signals with a purchase. Audience cohesion also determines the extent to which a statement about a consumer applies to the entire audience. When audience sizes get too large, though, audience cohesion suffers, as scale limits the amount of commonality shared by individuals in the audience. When audience cohesion is too refined, audience size suffers, as the specificity of statements about audience members disqualifies more and more people from the audience. While digital platforms certainly enabled audience-collection at an unprecedented scale, this dynamic has largely kept the same overall shape since the advent of market segmentation and cable television.&lt;/p&gt;
&lt;p&gt;Content programming - the sequencing of media texts into regular time slots - is a form of audience making. In the era of cable television, audiences were formed by individuals tuning in, every week, for certain content programming provided by cable networks. From the perspective of cable watchers, content programming helps individuals build a routine to consume their favorite shows. From the perspective of the cable network, content programming creates virtual waypoints for market segments to congregate around. When individuals tune in to the same set of content programs, advertisers can make statements describing these individuals. Deals between advertisers and cable networks are made when an advertiser’s target audience matches a market segment established by a cable network.&lt;/p&gt;
&lt;p&gt;Algorithmic feeds - the sequencing of media texts into an infinite-scroll feed - are a form of audience-making enabled by digital technology. In the era of social media companies and digital streaming, the relationship between individuals and the content they want to see is mediated by algorithms. For users, algorithms surface the content in which they have previously expressed interest. For tech companies, algorithms help steer individual users into behavioral groups according to how users express their interests. Algorithms help shape individual users on a platform into market segments. The digital nature of algorithmic audience-making allows companies to instantly combine and recombine user behaviors into countless permutations of market segments, creating highly specific audience lists at unprecedented scale. Moreover, unlike traditional content programming, where audience segments were manually identified and tracked, algorithmic systems continuously and automatically update these audience lists in real-time as users interact with content, enabling dynamic and precise audience targeting.&lt;/p&gt;
&lt;p&gt;AI-generated content - the sequencing of associative references into coherent, human-like media texts - represents the most recent manifestation of audience-making. Where algorithmic lists organize individuals into audiences by their association to discrete textual corpora like posts, creators or even hashtags, Transformer-based AI systems atomize this process further. Large Language Models have the capacity to ingest massive swaths of content at an inhuman scale, identify latent or hidden relationships between corpora, and then use those associations to produce convincingly human text. For users, content is now immanent to the digital platform itself; no longer is there a need for a creative producer of content to be at the other end of the digital platform. For tech companies, the latent space of the AI systems which stores the learned relationships between corpora becomes a map of content to users. When users interact with AI-generated content, they are simultaneously consuming and training the system on their preferences, creating a recursive loop of audience refinement that collapses the distinction between content creation and consumption. The business relationship between platforms and advertisers transforms as well - rather than matching pre-existing audiences to content, AI systems can dynamically generate content optimized for any desired audience segment, making the audience-content relationship fully programmable.&lt;/p&gt;
&lt;p&gt;Technology companies may be simply audience companies, but how can one discount the degree to which the fundamental technology has changed, despite fulfilling the same strategic need? From negotiating market segments using Nielsen ratings to a fully-automated manager of conceptual preferences, the shape of audience-making has changed drastically, but the purpose of audience-making has not changed in the least.&lt;/p&gt;
</description>
      <pubDate>Mon, 20 Jan 2025 16:14:20 -0800</pubDate>
    </item>
    <item>
      <title>Open Source vs Open Weights: The AI Transparency Paradox</title>
      <link>https://kojinglick.com/blog/open-source-vs-open-weights</link>
      <description>&lt;!--in progress--&gt;
&lt;!--_Read my complete thoughts, [here](/static/pdf/open-source-vs-open-weights-full.pdf)_--&gt;

&lt;p&gt;When Meta released their Llama 3.1 language models in July 2024, Mark Zuckerberg proclaimed it as a triumph for &quot;open source AI.&quot; But while the model&#39;s architecture and weights were made public, its training data remained hidden. This raises a crucial question: what does &quot;open source&quot; really mean for artificial intelligence?&lt;/p&gt;
&lt;p&gt;In traditional software, open source creates a specific relationship between creators and users. When you can read the source code, you know exactly how a program makes decisions. This transparency enables collaboration, faster bug detection, and continuous improvement. But AI turns this model on its head.&lt;/p&gt;
&lt;p&gt;The fundamental challenge lies in two critical differences. First, while source code tells you everything about how traditional software works, an AI model&#39;s behavior is equally—if not more—shaped by its training data. Imagine having a building&#39;s blueprint but not knowing what materials were used to construct it. The corporate reluctance to share training data isn&#39;t just about maintaining competitive advantage. These datasets often contain a complex mix of copyrighted material, personal information, and data collected under ambiguous terms.&lt;/p&gt;
&lt;p&gt;Second, traditional software is deterministic—you can trace any output back to specific lines of code. AI models, with their millions or billions of parameters, don&#39;t work that way. Small changes in training data can cause unpredictable ripples throughout the model&#39;s behavior. This breaks a fundamental assumption that makes open source collaboration work: the ability to understand how changes affect the program&#39;s behavior.&lt;/p&gt;
&lt;p&gt;The beauty of open source software lies in &quot;Linus&#39; Law&quot;: given enough eyeballs, all bugs are shallow. But for this principle to work in AI, we need both transparent datasets and better interpretability tools. These aren&#39;t separate challenges—they&#39;re two sides of the same coin. Simply releasing model weights while keeping training data secret, as Meta has done with Llama 3.1, falls short of true openness.&lt;/p&gt;
&lt;p&gt;We need innovative approaches that balance transparency with practical constraints. Representative, anonymized dataset samples could enable meaningful research without exposing sensitive information. Trusted third-party audits could ensure ethical standards without requiring full public disclosure. And advances in interpretability research could help us understand AI systems even when we can&#39;t trace their logic line-by-line.&lt;/p&gt;
&lt;p&gt;Just as open source software built an ecosystem of programmers who collaborate daily, we need to build a new framework for AI development—one that enables meaningful oversight while acknowledging the unique challenges of building neural networks at scale.&lt;/p&gt;
&lt;!--_[Read the Long-Form Article](/static/pdf/open-source-vs-open-weights-full.pdf)_--&gt;
</description>
      <pubDate>Mon, 13 Jan 2025 21:49:44 -0800</pubDate>
    </item>
    <item>
      <title>Part One: A Flatland Parable on Dimensionality</title>
      <link>https://kojinglick.com/blog/motio-est-omne-divisa-in-dimensiones-tres</link>
      <description>&lt;p&gt;&lt;em&gt;crossing dimensions is less fantastical that you think&lt;/em&gt;&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;This series is an attempt to capture the journey
to better understand weaponized linear algebra
as a thinker of words first, and numbers second.


These posts are dedicated to Maria,
who is subject to a far rougher version
of my thoughts than what appears here.
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;The simplest way humans experience dimensionality in information is via the fact that motion information exists in three-and-a-half dimensions. The main three components, describing spatial information, are the “x-dimension,” or the space computed, measured, and finally captured by the English word &lt;em&gt;width&lt;/em&gt;, the “y-dimension,” captured by the English word &lt;em&gt;height&lt;/em&gt;, and the “z-dimension,” captured by the English word &lt;em&gt;depth&lt;/em&gt;. The half dimension is time. Like the three spatial dimensions, we can measure it. But unlike the other three, where measuring a value involves traversal and computation, we only get to observe a discrete slice of the “time-dimension” in our conceptual model of motion. We don’t actually traverse &lt;em&gt;time&lt;/em&gt; in a &lt;em&gt;moment&lt;/em&gt; like we traversed the x-, y-, or z-dimension to measure &lt;em&gt;width&lt;/em&gt;, &lt;em&gt;height&lt;/em&gt;, and &lt;em&gt;depth&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;For a moment, let’s consider a different situation. Rather than three computed values (x, y, z) and one observed value (time), we only have two (x, y) of the previous three computed values, and the same observed value (time). This scenario may be familiar to readers of Edwin Abbott’s &lt;em&gt;Flatland&lt;/em&gt;. In Flatland, Flatlanders experience motion in two and a half dimensions. Imagine drawing a Flatlander on a piece of paper, and then drawing a circle around them. It would be impossible for the Flatlander to move outside of that circle according to the laws of motion in Flatland, because the Flatlander cannot use &lt;em&gt;depth&lt;/em&gt; to leap over what appears to us as a mere line on a surface. Now, feeling a little guilty for abusing higher-dimensional powers and imprisoning a Flatlander, how could we, three-and-a-half-dimensional beings, prove our existence to these Flatlanders, even just to apologize?&lt;/p&gt;
&lt;p&gt;Even setting aside the question of whether intelligent beings exist in these dimensions, could we prove to Flatlanders even the existence of higher dimensions? What miracle could we perform, with our powers as higher-dimensional beings, to convince the Flatlanders of our existence? Let us draw other objects into their world with the pencil we used to ensnare the first Flatlander we encountered. We notice that there is nothing on the Flatlander’s head. While the Flatlander is astonished to see a hat, a sideways P, appear in their reality and cover their nose with its brim, the conclusion that this has anything to do with the existence of higher dimensions is still a major leap. The pencil, never truly crossing the plane represented by the piece of paper, remains an extra-dimensional agent, completely outside of the Flatlander’s conception of reality.&lt;/p&gt;
&lt;p&gt;What we need to do is fully utilize the dimension of which the Flatlander is blissfully unaware, &lt;em&gt;depth&lt;/em&gt;, since moving the sideways P around and rotating it within the plane does little to convince the Flatlander of the existence of other dimensions. We need to do something that is impossible according to the laws of motion in Flatland. We reach into Flatland and spin the hat so that the brim rotates around the cap until it faces backwards, leaving the Flatlander’s nose uncovered. This miracle truly astonishes the Flatlander. According to other Flatlanders observing the newly hatted one, the brim started to shrink until the hat was a skull cap, and then slowly started to grow on the other side of the Flatlander※.&lt;/p&gt;
&lt;p&gt;Together we built a very small spatial model in our minds by imagining this scenario with the Flatlanders. Very similar mathematics are central to the architecture of most state-of-the-art Large Language Models. To summarize with a mathematical perspective, we projected an object with spatial information in two-dimensional space to an object with spatial information in three-dimensional space, spun it in a way only possible with a third controllable value, and then reduced it back to an object in two-dimensional space. In other words, we unfolded a new value of information inherent to the world that was hidden to Flatlanders, so that we could transform the hat, before folding that value back into the Flatlander’s native two-dimensionality. From the Flatlander’s perspective, the key to the miracle lies in transformations that occurred in a space that is already hard for them to conceive of using the rules of their environment.&lt;/p&gt;
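&lt;p&gt;For readers who prefer symbols to stories, the same unfold-transform-fold trick can be sketched in a few lines of NumPy. The particular point and rotation are arbitrary choices for illustration:&lt;/p&gt;

```python
import numpy as np

# A 2-D "hat" vertex in Flatland.
hat_point = np.array([1.0, 0.5])

# Unfold: embed the 2-D point into 3-D by giving it zero depth.
p3 = np.array([hat_point[0], hat_point[1], 0.0])

# Rotate 180 degrees about the y-axis, a motion that passes through
# the depth dimension Flatlanders cannot perceive.
theta = np.pi
rot_y = np.array([
    [np.cos(theta),  0.0, np.sin(theta)],
    [0.0,            1.0, 0.0],
    [-np.sin(theta), 0.0, np.cos(theta)],
])
p3_rotated = rot_y @ p3

# Fold back: project onto the plane by dropping the depth coordinate.
back_in_flatland = p3_rotated[:2]
print(back_in_flatland)  # approximately [-1.0, 0.5]: the hat is mirrored
```

To a Flatlander watching only the first and last lines, the point simply mirrored itself across their world; the rotation that explains it happened entirely in the coordinate they cannot see.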
&lt;p&gt;In the rest of the series, we will examine the outputs from Large Language Models like Flatlanders. Just like the Flatlander who is aware of some spatial transformation happening in a space they can&#39;t see, we know that &lt;em&gt;something&lt;/em&gt; happens to words in a higher dimensional semantic space that allows Large Language Models to, well, &lt;em&gt;model language&lt;/em&gt; so effectively. Keep in mind this simple conceptual model we created involving Flatlanders, as we will return to this simple world repeatedly to understand language models. Stay tuned!&lt;/p&gt;
&lt;p&gt;※ Some Flatlanders begin to worship the Flatlander with the hat, thinking that these powers were unique to the One with the Hat. Other Flatlanders argued that it was an external and exacting power that spoke through the Behatted One. This caused a great deal of disagreement and often resulted in violence. For a while, many Flatlanders adopted one set of beliefs, not out of ideological conviction, but in fear of being persecuted by roving bands made up of those who had committed to one belief. Still other Flatlanders disagreed with the whole premise, because if there truly were great and mysterious higher powers, why didn’t they free the recently Imprisoned One?&lt;/p&gt;
</description>
      <pubDate>Thu Dec 19 2024 18:43:22 GMT-0800 (Pacific Standard Time)</pubDate>
    </item>
    <item>
      <title>Updates</title>
      <link>https://kojinglick.com/blog/update-2024</link>
      <description>&lt;p&gt;Been a minute, eh? I feel like it has.&lt;/p&gt;
&lt;p&gt;I had three serious-looking jobs, one of which I have now.&lt;/p&gt;
&lt;p&gt;I started the year as essentially a research monkey staring down the barrel of automation. The worst part is when I offered to help them automate, they gave me a banana and reminded me how shiny my computer was. I lasted around a month.&lt;/p&gt;
&lt;p&gt;For the week after that, I was driving into San Francisco to do a trial week with a bio-tech company as a software engineer. I bombed it harder than I could have imagined. But I still got paid $500 per day. It&#39;s a strange feeling: I messed up, yet I was compensated at a higher annualized rate than I ever have been.&lt;/p&gt;
&lt;p&gt;Finally, I&#39;ve settled where I expected to, and really where my skills belong. I&#39;ve been working at CTEC full-time for a little more than six months. I do anything involving computers, which is the role I want. I spin up databases, create web-based services, do basic network analysis and statistics to back up what I&#39;m saying.&lt;/p&gt;
&lt;p&gt;I&#39;m happy here. Took me a long time to rebuild my confidence in my abilities after the week of high-status failure. But as the season reminds us, I was born a varmint, and a varmint I will always be. Nothing wrong with that.&lt;/p&gt;
</description>
      <pubDate>Sun Oct 27 2024 10:35:57 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
    <item>
      <title>Using HTMX with Rust - A Quickstart Guide</title>
      <link>https://kojinglick.com/blog/using-htmx-with-rust-quickstart</link>
      <description>&lt;h2&gt;Introduction&lt;/h2&gt;
&lt;p&gt;If you&#39;ve been following web development, you&#39;ve probably seen the hype about &lt;a href=&quot;https://htmx.org/&quot;&gt;htmx&lt;/a&gt;. In simple terms, it&#39;s a front-end library whose goal is to minimize the amount of JavaScript that gets shipped to the browser. The philosophy behind it is a return to the golden age of hypermedia-driven applications: rather than receiving JSON data from the server, you receive whole HTML blocks that get swapped into your application with lightning speed.&lt;/p&gt;
&lt;p&gt;This tutorial is a simple quickstart guide for building your first htmx application using Rust. Specifically, we&#39;ll be using the &lt;a href=&quot;https://actix.rs/docs/&quot;&gt;actix-web server framework&lt;/a&gt; for serving our requests, as well as the &lt;a href=&quot;https://keats.github.io/tera/docs/&quot;&gt;Tera templating engine&lt;/a&gt; to render the actual html. Follow along, or clone the repository &lt;a href=&quot;https://github.com/moonstripe/actix_htmx&quot;&gt;here&lt;/a&gt;. &lt;/p&gt;
&lt;p&gt;In order to follow along, you&#39;ll only need to install &lt;a href=&quot;https://www.rust-lang.org/tools/install&quot;&gt;Rust&lt;/a&gt;. Optionally, you can make use of my favorite CSS framework, &lt;a href=&quot;https://tailwindcss.com/docs/installation&quot;&gt;tailwindcss&lt;/a&gt;; you&#39;ll need to install &lt;a href=&quot;https://nodejs.org/en/download&quot;&gt;Node.js&lt;/a&gt; to add tailwind to your application. Let&#39;s get started!&lt;/p&gt;
&lt;h2&gt;Getting Started&lt;/h2&gt;
&lt;p&gt;To get started, initialize a new cargo project in the command-line terminal of your choice.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;cargo new actix_htmx
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;This will create a directory called &lt;code&gt;actix_htmx&lt;/code&gt;. It should look like this:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;actix_htmx
+-- src
|   +-- main.rs
+-- Cargo.toml
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;To add the required Rust dependencies, we&#39;ll change directory to &lt;code&gt;actix_htmx&lt;/code&gt;, and use &lt;code&gt;cargo add&lt;/code&gt; to include them in our Cargo.toml:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;cd actix_htmx
cargo add actix actix-web actix-files env_logger log tera
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;This will add the required dependencies for the actix-web server framework, logging utilities, and the tera templating engine.&lt;/p&gt;
&lt;p&gt;Next, let&#39;s set up the &lt;code&gt;/templates&lt;/code&gt; directory, which will house our html templates.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;mkdir templates
touch templates/main.html
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;In the IDE of your choice, add the following boilerplate to your &lt;code&gt;main.html&lt;/code&gt;.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;&lt;!DOCTYPE html&gt;
&lt;html&gt;
&lt;head&gt;
	&lt;title&gt;actix htmx&lt;/title&gt;
	&lt;script src=&quot;https://unpkg.com/htmx.org@1.9.10&quot; integrity=&quot;sha384-D1Kt99CQMDuVetoL1lrYwg5t+9QdHe7NLX/SoJYkXDFfX37iInKRy5xLSi8nO7UC&quot; crossorigin=&quot;anonymous&quot;&gt;&lt;/script&gt;
&lt;/head&gt;
&lt;body&gt;
	&lt;h1&gt;Hello World&lt;/h1&gt;
&lt;/body&gt;
&lt;/html&gt;
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;This will be the entry point for your web application. Finally, we&#39;re ready to implement the actix-web server in Rust!&lt;/p&gt;
&lt;h2&gt;A Basic Actix-Web Server&lt;/h2&gt;
&lt;p&gt;In &lt;code&gt;src/main.rs&lt;/code&gt;, replace the existing code with this:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;use actix_web::{get, App, web::Data, HttpResponse, HttpServer, Responder};
use tera::{Tera, Context};

#[get(&quot;/&quot;)]
async fn home(tera: Data&lt;Tera&gt;) -&gt; impl Responder {
    HttpResponse::Ok().body(tera.render(&quot;main.html&quot;, &amp;Context::new()).unwrap())
}

#[actix::main]
async fn main() -&gt; std::io::Result&lt;()&gt; {
    env_logger::init();
    log::debug!(&quot;Starting Server&quot;);

    let tera = Data::new(Tera::new(&quot;./templates/**/*.html&quot;).unwrap());

    HttpServer::new(move || {
        App::new()
            .app_data(tera.clone())
            .service(home)
    })
    .bind((&quot;127.0.0.1&quot;, 8000))?
    .run()
    .await
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Let&#39;s go through what this is doing. First, we import the required items from our dependencies. Then we define a handler that runs when the &lt;code&gt;/&lt;/code&gt; path receives a GET request. This function returns an &lt;code&gt;HttpResponse&lt;/code&gt; of OK (status code 200), whose body is the &lt;code&gt;main.html&lt;/code&gt; template rendered with an empty context. Finally, in the main function, we start an HTTP server that runs our application. The &lt;code&gt;.app_data()&lt;/code&gt; method shares the Tera renderer with the whole application so that every route can render HTML, and &lt;code&gt;.service()&lt;/code&gt; registers the &lt;code&gt;home&lt;/code&gt; handler at the &lt;code&gt;/&lt;/code&gt; route. The server is bound to the IP address 127.0.0.1 on port 8000, so we can reach it in the browser at &lt;code&gt;localhost:8000/&lt;/code&gt; when we run the following command.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;RUST_LOG=debug cargo run
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;You should see a web page with a header tag that reads &quot;Hello World&quot;! That being said, we definitely don&#39;t need htmx for this example, so let&#39;s add some more functionality.&lt;/p&gt;
&lt;h2&gt;Adding Counter Functionality&lt;/h2&gt;
&lt;p&gt;Let&#39;s implement a feature that allows the user to increment and decrement a counter, starting at zero. In our &lt;code&gt;templates/main.html&lt;/code&gt;, replace the &lt;code&gt;&lt;body&gt;...&lt;/body&gt;&lt;/code&gt; with the following snippet.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;... rest of templates/main.html ...
&lt;body&gt;

	&lt;p id=&quot;counter&quot;&gt;Counter: {{ counter_value }}&lt;/p&gt;

	&lt;button hx-get=&quot;/increment&quot; hx-target=&quot;#counter&quot;&gt;increment&lt;/button&gt;

	&lt;button hx-get=&quot;/decrement&quot; hx-target=&quot;#counter&quot;&gt;decrement&lt;/button&gt;

&lt;/body&gt;
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;What&#39;s going on here? In our &lt;code&gt;&lt;p&gt;...&lt;/p&gt;&lt;/code&gt; tag, we&#39;re using a piece of context that we&#39;ll provide in the &lt;code&gt;home&lt;/code&gt; function of our Rust code; we&#39;ll see in a moment how that&#39;s implemented. The button immediately below it includes our first piece of htmx magic. Rather than using AJAX written in a JavaScript file, htmx elements can fetch data by themselves! &lt;code&gt;hx-get&lt;/code&gt; issues a GET request to the corresponding endpoint, here &lt;code&gt;/increment&lt;/code&gt;. In &lt;code&gt;hx-target&lt;/code&gt;, we specify where we want the response to go, here replacing our counter &lt;code&gt;&lt;p&gt;...&lt;/p&gt;&lt;/code&gt; element. The second button is essentially the same, but decrements the counter. Now let&#39;s implement the fun stuff in Rust.&lt;/p&gt;
&lt;p&gt;First, let&#39;s add a new import at the very top of the &lt;code&gt;src/main.rs&lt;/code&gt; file. This will come in handy when dealing with application state.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;use std::sync::Mutex;
... rest of src/main.rs ...
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;I can&#39;t go into detail about what &lt;code&gt;Mutex&lt;/code&gt; does. (Really, I do not have the ability nor the knowledge to explain all of the wonderful things it does.) Suffice it to say, it enables thread-safe mutability of data, which means we can safely change the value from anywhere in the application.&lt;/p&gt;
&lt;p&gt;Next, let&#39;s add a struct that will house the state of our application. Below the imports, add the struct.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;... rest of src/main.rs ...

struct AppStateCounter {
    counter: Mutex&lt;i32&gt;,
}

... rest of src/main.rs ...
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;This is simple enough. Inside our struct, we define a field wrapped in our magical &lt;code&gt;Mutex&lt;/code&gt;, which stores a signed 32-bit integer.&lt;/p&gt;
&lt;p&gt;Remember that our &lt;code&gt;templates/main.html&lt;/code&gt; has buttons that refer to the &lt;code&gt;/increment&lt;/code&gt; and &lt;code&gt;/decrement&lt;/code&gt; endpoints? The &lt;code&gt;/&lt;/code&gt; route also needs context from Tera to display the initial counter value. Let&#39;s implement our new route handlers next.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;... rest of src/main.rs ...
#[get(&quot;/&quot;)]
async fn home(tera: Data&lt;Tera&gt;, data: Data&lt;AppStateCounter&gt;) -&gt; impl Responder {
    let counter = data.counter.lock().unwrap();

    let mut home_context = Context::new();
    home_context.insert(&quot;counter_value&quot;, &amp;*counter);

    HttpResponse::Ok().body(tera.render(&quot;main.html&quot;, &amp;home_context).unwrap())
}

#[get(&quot;/increment&quot;)]
async fn increment(tera: Data&lt;Tera&gt;, data: Data&lt;AppStateCounter&gt;) -&gt; impl Responder {
    let mut counter = data.counter.lock().unwrap();
    *counter += 1;
    log::info!(&quot;Incremented Counter Value: {}&quot;, *counter);

    let mut increment_context = Context::new();
    increment_context.insert(&quot;counter_value&quot;, &amp;*counter);

    HttpResponse::Ok().body(tera.render(&quot;components/counter.html&quot;, &amp;increment_context).unwrap())
}

#[get(&quot;/decrement&quot;)]
async fn decrement(tera: Data&lt;Tera&gt;, data: Data&lt;AppStateCounter&gt;) -&gt; impl Responder {
    let mut counter = data.counter.lock().unwrap();
    *counter -= 1;
    log::info!(&quot;Decremented Counter Value: {}&quot;, *counter);

    let mut decrement_context = Context::new();
    decrement_context.insert(&quot;counter_value&quot;, &amp;*counter);

    HttpResponse::Ok().body(tera.render(&quot;components/counter.html&quot;, &amp;decrement_context).unwrap())
}
... rest of src/main.rs ...
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;These are relatively simple, but let&#39;s go through what they accomplish. In the function parameters, we&#39;ve added a new item that pulls in our &lt;code&gt;AppStateCounter&lt;/code&gt; for later use. In our &lt;code&gt;/&lt;/code&gt; route, we pull the numeric value out of this struct. After creating a new Tera &lt;code&gt;Context&lt;/code&gt;, we insert our counter value, then pass it into our Tera rendering function in the response body. Our new &lt;code&gt;/increment&lt;/code&gt; and &lt;code&gt;/decrement&lt;/code&gt; route handlers are similar, pulling out the counter value and modifying it; note that they render only a small counter fragment, which htmx swaps into the &lt;code&gt;#counter&lt;/code&gt; element.&lt;/p&gt;
&lt;p&gt;Finally, we&#39;ll add our new Tera template by creating a new &lt;code&gt;templates/components&lt;/code&gt; directory, and adding a file called &lt;code&gt;templates/components/counter.html&lt;/code&gt;. This is what it should look like:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;&lt;p id=&quot;counter&quot;&gt;Counter: {{ counter_value }}&lt;/p&gt;
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Let&#39;s head back to our &lt;code&gt;src/main.rs&lt;/code&gt; file, since we need to add everything to the application implementation in our &lt;code&gt;main&lt;/code&gt; function. Replace the main function with the following implementation which incorporates our state into our actix-web application.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;... rest of src/main.rs ...
#[actix::main]
async fn main() -&gt; std::io::Result&lt;()&gt; {
    env_logger::init();
    log::debug!(&quot;Starting Server&quot;);

    let tera = Data::new(Tera::new(&quot;./templates/**/*.html&quot;).unwrap());

    let counter = Data::new(AppStateCounter {
        counter: Mutex::new(0),
    });

    HttpServer::new(move || {
        App::new()
            .app_data(tera.clone())
            .app_data(counter.clone())
            .service(home)
            .service(increment)
            .service(decrement)
    })
    .bind((&quot;127.0.0.1&quot;, 8000))?
    .run()
    .await
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;In this snippet, we created an instance of our AppStateCounter struct and passed it as application data into our application. We&#39;ve also registered two new routes for &lt;code&gt;/increment&lt;/code&gt; and &lt;code&gt;/decrement&lt;/code&gt;.&lt;/p&gt;
&lt;p&gt;Finally, when we rerun our Rust application, we should be able to increment and decrement our counter! We&#39;ve used essential parts of actix-web, htmx, and tera to create this lightweight and performant web application. &lt;/p&gt;
&lt;h2&gt;Optional: Add TailwindCSS&lt;/h2&gt;
&lt;p&gt;I love tailwindcss because it makes styling a breeze. If you&#39;re wondering how to implement tailwind outside of a javascript framework, this section is for you.&lt;/p&gt;
&lt;p&gt;To initialize tailwindcss, use the following command, which requires Node.js.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;npx tailwindcss init
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;This will create a &lt;code&gt;tailwind.config.js&lt;/code&gt; file in your project directory. In this file, add our templates to the &lt;code&gt;content&lt;/code&gt; array. This ensures that tailwindcss generates the utility classes we use in our HTML templates. Your tailwind.config.js should look like this:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;/** @type {import(&#39;tailwindcss&#39;).Config} */
module.exports = {
    content: [&quot;./templates/**/*.html&quot;],
    theme: {
        extend: {},
    },
    plugins: [],
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Let&#39;s add a new &lt;code&gt;/static&lt;/code&gt; directory to house our CSS files. Here, add a file called &lt;code&gt;/static/main.css&lt;/code&gt; with the following directives to load tailwind.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;@tailwind base;
@tailwind components;
@tailwind utilities;
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;To generate our final css files, use the tailwind command.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;npx tailwindcss -i ./static/main.css -o ./static/tailwind.css
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Running this will generate &lt;code&gt;/static/tailwind.css&lt;/code&gt;.&lt;/p&gt;
&lt;p&gt;In our &lt;code&gt;src/main.rs&lt;/code&gt; file, add a new &lt;code&gt;.service()&lt;/code&gt; under our &lt;code&gt;.app_data()&lt;/code&gt; initializations. Your &lt;code&gt;main&lt;/code&gt; function should now look like this:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;... rest of src/main.rs ...
#[actix::main]
async fn main() -&gt; std::io::Result&lt;()&gt; {
    env_logger::init();
    log::debug!(&quot;Starting Server&quot;);

    let tera = Data::new(Tera::new(&quot;./templates/**/*.html&quot;).unwrap());

    let counter = Data::new(AppStateCounter {
        counter: Mutex::new(0),
    });

    HttpServer::new(move || {
        App::new()
            .app_data(tera.clone())
            .app_data(counter.clone())
            // making /static files accessible to the client
            .service(actix_files::Files::new(&quot;/static&quot;, &quot;./static&quot;).show_files_listing())
            .service(home)
            .service(increment)
            .service(decrement)
    })
    .bind((&quot;127.0.0.1&quot;, 8000))?
    .run()
    .await
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Finally, alter your &lt;code&gt;templates/main.html&lt;/code&gt; to include a link to our &lt;code&gt;/static/tailwind.css&lt;/code&gt; file.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;&lt;!DOCTYPE html&gt;
&lt;html&gt;
&lt;head&gt;
	&lt;title&gt;actix htmx&lt;/title&gt;
	&lt;script src=&quot;https://unpkg.com/htmx.org@1.9.10&quot; integrity=&quot;sha384-D1Kt99CQMDuVetoL1lrYwg5t+9QdHe7NLX/SoJYkXDFfX37iInKRy5xLSi8nO7UC&quot; crossorigin=&quot;anonymous&quot;&gt;&lt;/script&gt;

	{# Adding a link to our tailwind.css #}
	&lt;link rel=&quot;stylesheet&quot; href=&quot;/static/tailwind.css&quot;/&gt;
&lt;/head&gt;
&lt;body&gt;
	&lt;h1 class=&quot;text-xl&quot;&gt;Actix Htmx&lt;/h1&gt;

	&lt;p id=&quot;counter&quot;&gt;Counter: {{ counter_value }}&lt;/p&gt;

	&lt;button hx-get=&quot;/increment&quot; hx-target=&quot;#counter&quot;&gt;increment&lt;/button&gt;

	&lt;button hx-get=&quot;/decrement&quot; hx-target=&quot;#counter&quot;&gt;decrement&lt;/button&gt;
&lt;/body&gt;
&lt;/html&gt;
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Now when you run the server, the &lt;code&gt;&lt;h1&gt;...&lt;/h1&gt;&lt;/code&gt; element should appear larger!&lt;/p&gt;
&lt;p&gt;You would have to rerun the tailwind output command every time you make a change, so I&#39;ve implemented the following build script in &lt;code&gt;build.rs&lt;/code&gt;:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;use std::process::Command;

fn main() {
    let _ = Command::new(&quot;npx&quot;)
        .arg(&quot;tailwindcss&quot;)
        .arg(&quot;-i&quot;)
        .arg(&quot;./static/main.css&quot;)
        .arg(&quot;-o&quot;)
        .arg(&quot;./static/tailwind.css&quot;)
        .output();
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Simply put, every time the application is built, a child process runs the tailwind output command, so I don&#39;t have to do it manually. Make sure to reference the build script in your &lt;code&gt;Cargo.toml&lt;/code&gt; by adding &lt;code&gt;build = &quot;build.rs&quot;&lt;/code&gt; under the &lt;code&gt;edition&lt;/code&gt; line in the &lt;code&gt;[package]&lt;/code&gt; section.&lt;/p&gt;
&lt;h2&gt;Conclusion&lt;/h2&gt;
&lt;p&gt;I&#39;ve had a lot of fun learning htmx and paring back my web applications with Rust. I built a whole web user interface around &lt;a href=&quot;https://ollama.ai/&quot;&gt;ollama&lt;/a&gt; that&#39;s hosted publicly &lt;a href=&quot;https://ai.moonstripe.com&quot;&gt;here&lt;/a&gt;, using the stack I&#39;ve just talked about. Go on and make cool things using web technologies as they were intended to be used!&lt;/p&gt;
</description>
      <pubDate>Fri Jan 05 2024 10:18:32 GMT-0800 (Pacific Standard Time)</pubDate>
    </item>
    <item>
      <title>Project Gravity</title>
      <link>https://kojinglick.com/blog/project-gravity</link>
      <description>&lt;h2&gt;Introducing Project Gravity&lt;/h2&gt;
&lt;p&gt;&lt;a href='https://projectgravity.io' target=&quot;_blank&quot; class=&quot;text-black dark:text-neutral&quot;&gt;Project Gravity&lt;/a&gt; hosts &lt;a href='https://projectgravity.io/chat' target=&quot;_blank&quot; class=&quot;text-black dark:text-neutral&quot;&gt;SoftLandingGPT&lt;/a&gt;, an AI-enabled librarian for resources aimed at empowering bystanders to engage with people who may be falling down paths towards radicalization. The goal is to reach out to service providers like school counselors and community leaders to help connect them to the resources available in the countering targeted violence space.&lt;/p&gt;
&lt;h2&gt;Components&lt;/h2&gt;
&lt;p&gt;As Tech Lead, I was in charge of implementing everything. From the interface with OpenAI&#39;s API, to &lt;a href='https://huggingface.co/moonstripe/hate_speech_classification_v1' target=&quot;_blank&quot; class=&quot;text-black dark:text-neutral&quot;&gt;building custom moderation models on huggingface&lt;/a&gt;, this is definitely my most ambitious web-hosted software project yet. I wanted to go over the major features I&#39;ve implemented in the last month.&lt;/p&gt;
&lt;h3&gt;AI-enabled, natural language directory&lt;/h3&gt;
&lt;p&gt;The centerpiece of &lt;a href='https://projectgravity.io' target=&quot;_blank&quot; class=&quot;text-black dark:text-neutral&quot;&gt;Project Gravity&lt;/a&gt; is our &lt;a href='https://projectgravity.io/chat' target=&quot;_blank&quot; class=&quot;text-black dark:text-neutral&quot;&gt;SoftLandingGPT&lt;/a&gt; tool. Presented like an online chat, it pulls together resources we&#39;ve found that might help equip a person who is unsure of how to engage with a loved one falling down a conspiracy rabbit hole. It uses websockets to connect with our backend, which interfaces with OpenAI&#39;s API. The conversation is preceded by several system prompts that give GPT-3.5 a more defined role and equip it with research and resources pulled together by our team.&lt;/p&gt;
&lt;h3&gt;Custom hate speech classifier&lt;/h3&gt;
&lt;p&gt;After testing OpenAI&#39;s moderation endpoint, it became clear that we had narrower requirements for a hate speech classifier. Its output drives our &quot;traffic light&quot; system, in which we flag conversations as green, yellow, or red for internal review. So I trained a BERT model for four days on the &lt;a href='https://huggingface.co/datasets/ucberkeley-dlab/measuring-hate-speech' target=&quot;_blank&quot; class=&quot;text-black dark:text-neutral&quot;&gt;Measuring Hate Speech dataset provided by UC Berkeley&#39;s D-Lab&lt;/a&gt; to create &lt;a href='https://huggingface.co/moonstripe/hate_speech_classification_v1' target=&quot;_blank&quot; class=&quot;text-black dark:text-neutral&quot;&gt;our custom model&lt;/a&gt;. This was the first model I&#39;ve ever trained from scratch and deployed on huggingface. We hope that by providing it with an open license for use, we demonstrate how committed we are to pushing the envelope for developing tools that can be used to counter targeted violence.&lt;/p&gt;
&lt;h3&gt;Content management by Sanity&lt;/h3&gt;
&lt;p&gt;Once again, &lt;a href='https://www.sanity.io/' target=&quot;_blank&quot; class=&quot;text-black dark:text-neutral&quot;&gt;Sanity&lt;/a&gt; comes to the rescue. Everything from our &lt;a href='https://projectgravity.io/blog' target=&quot;_blank&quot; class=&quot;text-black dark:text-neutral&quot;&gt;blog&lt;/a&gt;, to our &lt;a href='https://projectgravity.io/research' target=&quot;_blank&quot; class=&quot;text-black dark:text-neutral&quot;&gt;research citations&lt;/a&gt; page, to the deployed resources, is hosted on Sanity, which allows the whole team to get involved without pinging me every time they find a new resource. Its incredible uptime, quick responses, and generous pricing plan made it a no-brainer to integrate into the project. &lt;strong&gt;Thank you, Sanity!&lt;/strong&gt;&lt;/p&gt;
&lt;h3&gt;Admin Dashboard&lt;/h3&gt;
&lt;p&gt;Though definitely a long-term goal, we want to aggregate important natural language trends from our conversations without breaching our privacy priority. With Google-based OAuth, team members can glean essential top level statistics and get key summaries from the database without messing with the database itself.&lt;/p&gt;
&lt;h3&gt;Custom link tracking&lt;/h3&gt;
&lt;p&gt;At the suggestion of our friends at the McCain Institute, I implemented a custom link passthrough service that counts clicks on all of the outbound links provided by our tool. This will help us build a business model and demonstrate our value to our partners through hard data.&lt;/p&gt;
&lt;h2&gt;Try it out and tell me what you think!&lt;/h2&gt;
&lt;p&gt;I&#39;ve had a blast implementing everything, but this isn&#39;t just a portfolio project. This is something that we think really has legs, and might turn into something much bigger than an Invent2Prevent competition entry. I&#39;m honored to have had an integral part in building &lt;a href='https://projectgravity.io' target=&quot;_blank&quot; class=&quot;text-black dark:text-neutral&quot;&gt;Project Gravity&lt;/a&gt;, and I hope you see the vision, too.&lt;/p&gt;
</description>
      <pubDate>Mon Oct 23 2023 18:05:08 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
    <item>
      <title>Man-in-the-middle for good?</title>
      <link>https://kojinglick.com/blog/man-in-the-middle-for-good</link>
      <description>&lt;p&gt;As part of the Invent2Prevent competition, which is a Department of Homeland Security sponsored startup competition with the objective of creating counter-violent-extremism programs or tools, I had the following thought: what if you could set up an HTTPS proxy on youth-owned devices to identify possible extremist content so that parents could broach the conversation before the user got too deep in extremist content?&lt;/p&gt;
&lt;p&gt;Obviously, this is a potentially dangerous line of thinking. Designed poorly, this allows parents unfettered access over their child’s web traffic, enabling parents to enact draconian limitations over their child’s growing interests. &lt;/p&gt;
&lt;p&gt;But designed well, this could be a tool that offers concerned parents or school counselors a canary in the dark mines of digital extremist recruitment.&lt;/p&gt;
&lt;p&gt;Rather than simply forwarding web traffic and pushing URLs to be checked against some web-hosted database, a more sensitive solution would be to pull down a map of URLs associated with extremism and do the checking on the device itself. And by showing parents only deeply concerning patterns in their child’s consumption of digital media, it would discourage parents from using the tool repressively.&lt;/p&gt;
&lt;p&gt;At the end of the day, the goal of the tool would be to enable genuine conversations between potential extremist recruits and their immediate, trusted and offline social network. But much more planning and clever design will be needed to ensure that this doesn’t simply enter the history books as another repressive internet tool or a back door for nefarious actors. &lt;/p&gt;
&lt;p&gt;Lots to think about. &lt;/p&gt;
</description>
      <pubDate>Tue Sep 19 2023 19:19:01 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
    <item>
      <title>CTEC Progress</title>
      <link>https://kojinglick.com/blog/ctec-progress</link>
      <description>&lt;p&gt;It&#39;s been a while since I&#39;ve updated my blog here, and since I&#39;ve been primarily busy with establishing myself at &lt;a href=&quot;https://www.middlebury.edu/institute/academics/centers-initiatives/ctec&quot;&gt;CTEC&lt;/a&gt;, I&#39;ll cover my progress there in some detail. We&#39;ll cover some of the reports I&#39;ve finished up, and projects that are just starting to come into full swing as I start my full-time engagement with the center this summer.&lt;/p&gt;
&lt;h2&gt;Wagner Reports&lt;/h2&gt;
&lt;p&gt;Probably my biggest contribution so far has been the publication of three reports about the Wagner Group. They&#39;ve been a lot of fun to write, and I&#39;ve been able to flex all manner of methodological muscles and conquer all sorts of theoretical challenges. In addition to these reports, I was able to contribute to the &lt;a href=&quot;https://www.wsj.com/video/series/shadow-men/shadow-men-inside-wagner-russias-secret-war-company/29735C37-0B4E-4E70-8E8C-C46FB711370C?mod=djem10point&quot;&gt;Wall Street Journal&#39;s documentary about the Wagner Group&lt;/a&gt;, and CTEC earned a place in the acknowledgements!&lt;/p&gt;
&lt;h3&gt;The Changing Face of the Wagner Group: From Military Adventurism to Venture Capitalism&lt;/h3&gt;
&lt;p&gt;The &lt;a href=&quot;https://www.middlebury.edu/institute/academics/centers-initiatives/ctec/ctec-publications/changing-face-wagner-group-military&quot;&gt;first one&lt;/a&gt; was largely my co-author&#39;s work, and it covers the financial empire of Yevgeniy Prigozhin. I was able to contribute editorial skills, as well as an open-source investigation into the hackathon that the Wagner Group held at its headquarters in St. Petersburg. This paper really got me thinking more deeply about the changing face of conflict, as nation-states are more restricted than ever in claiming territories and other exercises in strategic flexibility. The Wagner Group seems less like a private military company and more like a strategic multi-tool for Russian foreign interests.&lt;/p&gt;
&lt;h3&gt;The Wagner Group’s Social Footprint: A Time-Series Sentiment Analysis of PMC World&lt;/h3&gt;
&lt;p&gt;I had a lot of fun writing and researching &lt;a href=&quot;https://www.middlebury.edu/institute/academics/centers-initiatives/ctec/ctec-publications/wagner-groups-social-footprint-time-series&quot;&gt;this one&lt;/a&gt;. I built a custom scraping tool for VK and developed brand new proficiencies in Natural Language Processing, in Russian no less. By examining the sentiment of all engagements, both posts and comments, in the PMCWorld VK channel, I was able to find a distinct, statistically significant positive period between Putin&#39;s speech calling Ukraine rightful Russian territory and the expansion of Russian aggression towards Ukraine. Not only this, text frequency analysis during this period revealed a lot of chatter involving the word &quot;movie,&quot; which led me to the discovery that Prigozhin had purchased a movie studio in order to release a propagandistic film presenting the 2014 initiation of hostilities as victimizing Russian speakers.&lt;/p&gt;
&lt;p&gt;Despite my excitement, my challenge here was presenting these findings as descriptive rather than causal. Although I wanted to point to these discoveries as revolutionary evidence that Prigozhin was acting as a state propagandist drumming up support for the later invasion, I think I did well in holding those more bombastic statements back. This report was very helpful in teaching me how to present statistics-based discoveries in an even-handed manner. As Mark Twain said, &lt;a href=&quot;https://en.wikipedia.org/wiki/Lies,_damned_lies,_and_statistics&quot;&gt;&quot;there are lies, damned lies, and statistics&quot;&lt;/a&gt;.&lt;/p&gt;
&lt;h3&gt;From Ideas to Action: Yevgeniy Prigozhin, Wagner Group, and the Operationalization of Duginism&lt;/h3&gt;
&lt;p&gt;&lt;a href=&quot;https://www.middlebury.edu/institute/academics/centers-initiatives/ctec/ctec-publications/ideas-action-yevgeniy-prigozhin-wagner-group&quot;&gt;This report&lt;/a&gt; was a lot more up my alley, at least in terms of what I&#39;ve studied in my undergraduate career as a political scientist and media theorist. It&#39;s a rather in-depth analysis of Duginist thought, as well as brilliant open-source work by my colleague and myself connecting Dugin and Prigozhin closely beyond theory and into the real world. Not much to say here, but I definitely learned to appreciate the editorial work by the other full-time staff at CTEC, who sought to make my writing read more like an academic work than a diatribe against &quot;Russian thought.&quot; As they pointed out, Russia is a large and multi-cultural place, and ascribing the thought of a particular esoteric thinker to the entire country reduces the credibility of my statements.&lt;/p&gt;
&lt;h2&gt;New projects&lt;/h2&gt;
&lt;p&gt;The project I&#39;m engaging with this summer is probably my most ambitious software project to date. It&#39;s a fully native (and hopefully cross-platform) application that helps researchers conduct analysis of a particular social media platform. Luckily, there&#39;s well-founded reason to think I can be successful despite being a one-man team on a project that would usually take an entire software development studio. &lt;a href=&quot;https://tauri.app/&quot;&gt;Tauri&lt;/a&gt; is an awesome framework with a frontend written in web-friendly languages like TypeScript and HTML, and a powerful backend written in Rust. I&#39;ve chosen &lt;a href=&quot;https://svelte.dev/&quot;&gt;Svelte&lt;/a&gt; as my frontend framework, which lets me sketch out big ideas with little to no boilerplate. As a side note, the ability to write components with the following syntax makes my heart happy:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;    &lt;script lang=&quot;ts&quot;&gt;
        // TypeScript for component logic here
    &lt;/script&gt;

    &lt;div&gt;
        &lt;!-- plain HTML markup here --&gt;
    &lt;/div&gt;

    &lt;style&gt;
        /* optionally add scoped styling here, but mostly I use tailwindcss */
    &lt;/style&gt;
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;I&#39;ve really made strides in my abilities and confidence as a Rust developer, and I&#39;ve been able to bring ideas to life in Rust with speed and effectiveness. Also, this project features my most extensive attempt yet at project management and accountability, so I have my work cut out for me.&lt;/p&gt;
&lt;h2&gt;Wrapping up&lt;/h2&gt;
&lt;p&gt;I&#39;ve been blessed to find a place to apply myself and learn so much in such a self-directed manner. The best part is that despite my personal development, I don&#39;t feel like I&#39;m being selfish for applying my own perspective to solving problems, and my initiative has been treated as a plus and not as a minor annoyance. Hopefully I come back to this blog with good news about my projects.&lt;/p&gt;
</description>
      <pubDate>Fri Jun 09 2023 18:09:35 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
    <item>
      <title>LastPass, and the Foundation of Trust</title>
      <link>https://kojinglick.com/blog/lastpass</link>
<description>&lt;p&gt;So, &lt;a href=&quot;https://www.techtarget.com/searchsecurity/news/252529329/LastPass-faces-mounting-criticism-over-recent-breach&quot;&gt;LastPass has famously dropped the ball&lt;/a&gt;. In late December 2022, tucked into the holiday season, LastPass announced that &lt;a href=&quot;https://blog.lastpass.com/2022/12/notice-of-recent-security-incident/&quot;&gt;threat actors have obtained access&lt;/a&gt; to backups of encrypted vaults, and can attempt to brute-force your master password to get at all of the juicy secrets inside. More recently, the full extent of the breach has been coming to light. Although the technical details are integral to making decisions, LastPass&#39;s communications have been completely taken over by legal. &lt;a href=&quot;https://www.lastpass.com/company/newsroom&quot;&gt;The LastPass communications team has not released a blog post since November&lt;/a&gt;, deeply entrenching the emerging tensions between LastPass&#39;s lawyers and the enterprise and personal customers (like me) who were affected. Password managers are a product whose value proposition is tied directly to trust and mutual understanding, and the current debate underscores the issues that arise when an open source community interacts with a closed source corporate bureaucracy.&lt;/p&gt;
&lt;p&gt;As the &lt;a href=&quot;https://opensourcesecurity.io/2023/01/01/episode-356-lastpass-ducked-up-now-what/&quot;&gt;Open Source Security Podcast&lt;/a&gt; does a good job of highlighting, the glaring issue is that LastPass&#39;s current communications strategy is actively confusing to their most important customers. When any security product fails, the response is going to be determined by the nature of the breach, and building clear next steps relies on clear communication between the technical teams on both sides. In this case, the developers responsible for rotating API keys and updating credentials aren&#39;t even clear about what exactly the breach was. Does the threat actor have access to vault metadata? If so, what did &quot;encrypted vault&quot; ever really mean? From what I can tell, a major contributor to the general sense of confusion is that developers have lost faith that the words used by LastPass in their promotional content and their crisis response are the same words that they use in their day-to-day work.&lt;/p&gt;
&lt;p&gt;Trust relies on clear and open communication. The sense is that LastPass is prioritizing mitigating backlash by mediating the conversation that needs to happen about the consequences of the breach. In other words, LastPass has entered a vicious cycle of lying to their most affected customers, and rather than rebuilding developer confidence, they&#39;ve turned their most valuable customers into their most outspoken critics.&lt;/p&gt;
</description>
      <pubDate>Sat Jan 21 2023 18:33:53 GMT-0800 (Pacific Standard Time)</pubDate>
    </item>
    <item>
      <title>Happy New Year!</title>
      <link>https://kojinglick.com/blog/happy-new-year</link>
      <description>&lt;p&gt;It&#39;s been a while, since the end of last semester was a bit of a doozy. I just wanted to recap my 2022 and forecast my 2023 here in a few bullet points:&lt;/p&gt;
&lt;h2&gt;2022 Accomplishments&lt;/h2&gt;
&lt;p&gt;I started my grad program in Non-Proliferation and Terrorism Studies at the Middlebury Institute of International Studies at Monterey. It&#39;s been a fantastic year, with many fond memories to look back on: &lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;I trialled the &lt;a href=&quot;https://www.middlebury.edu/institute/academics/centers-initiatives/meta-lab&quot;&gt;META Lab&lt;/a&gt; over the summer, and contributed to their &lt;a href=&quot;https://www.middlebury.edu/institute/events/mirror-gap-analysis-presentation-cipe-02-24-2022&quot;&gt;Mirror Gap Analysis&lt;/a&gt; project. I ended up making a front-end and an API which went unused due to lack of organization. I also added a Course feature to their WordPress site, which ended up taking years of my life from me when the page went down hours before a meeting with the Provost. Between that and the overall disorganization, I don&#39;t really see myself at META Lab for an extended period of time. Factors relating to the personnel, who all joined as friends years ago, also contributed to the lack of clarity in mission or task.&lt;ul&gt;
&lt;li&gt;Takeaway: I hate WordPress and PHP. Even student organizations need rigorous operations procedures, even if you&#39;re all friends.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;I ended up at &lt;a href=&quot;https://www.middlebury.edu/institute/academics/centers-initiatives/ctec&quot;&gt;CTEC&lt;/a&gt;, whose mission and goals are much clearer than the META Lab&#39;s, and which has been doing significant work in the field. I joined Jason Blazakis&#39; project, and in the coming weeks, it&#39;ll be published!! That was a whole process, but I think I learned how to work more smoothly with others and definitely how to work better with distributed systems. I have also been assigned to a couple of other projects at CTEC where my role is a mixture of developer, system admin, and other techy positions we don&#39;t have the personnel to fill.&lt;ul&gt;
&lt;li&gt;Takeaway: Organizations with direction and focus actually offer entry-level positions more impact, whereas flat organizations need to be deliberate if they want to get anything done. Also, &lt;code&gt;nohup&lt;/code&gt; rocks.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;My Japanese skills have improved significantly after two semesters of being forced to do the Intermediate-Advanced and Advanced Japanese courses. Even my ability to throw together thoughts in writing has improved. Very excited to see more development.&lt;ul&gt;
&lt;li&gt;Takeaway: I&#39;m still illiterate, but boy can I convince people I&#39;m good at Japanese.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;When it came to &lt;a href=&quot;https://www.moonstripe.com/&quot;&gt;Moonstripe Design&lt;/a&gt;, 2022 was a big year:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;I wrapped up a contract with &lt;a href=&quot;https://www.privacycode.ai/&quot;&gt;PrivacyCode&lt;/a&gt;, adding their slider and blog systems.&lt;/li&gt;
&lt;li&gt;I just finished a contract with &lt;a href=&quot;https://www.navablaw.com/&quot;&gt;Navab Law&lt;/a&gt;, adding their digital presence to the internet, and also engineering an email forwarder to ease client intake.&lt;/li&gt;
&lt;li&gt;I also built the &lt;a href=&quot;https://opencryptomap.deno.dev/&quot;&gt;Open Ethereum Map&lt;/a&gt; and learned D3.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Socially, I&#39;ve had a lot of fun. From backpacking trips, to hikes with grad school colleagues, I think I&#39;ve really been able to start growing my social safety net for the first time since 2019.&lt;/p&gt;
&lt;h2&gt;2023 Goals&lt;/h2&gt;
&lt;p&gt;My goal of attending MIIS was to finally get published, and it&#39;s looking like a reality with CTEC. That being said, there are many more objectives this year to attain as well.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;I want to see the CTEC paper all the way through.&lt;/li&gt;
&lt;li&gt;Start a portfolio of research and expand on it.&lt;/li&gt;
&lt;li&gt;Build my knowledge of programming, and continue to develop DevOps skills.&lt;/li&gt;
&lt;li&gt;Finally utilize my Pi Cloud for something.&lt;/li&gt;
&lt;li&gt;Go backpacking more.&lt;/li&gt;
&lt;li&gt;Work on my posture.&lt;/li&gt;
&lt;li&gt;Wake up earlier.&lt;/li&gt;
&lt;li&gt;Cook more.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Otherwise, I&#39;m just going to continue to be me. Starting with a trip to see friends down South.&lt;/p&gt;
&lt;p&gt;Can&#39;t wait to see what 2023 brings!&lt;/p&gt;
&lt;p&gt;Best,&lt;/p&gt;
&lt;p&gt;Kojin&lt;/p&gt;
</description>
      <pubDate>Tue Jan 03 2023 14:18:22 GMT-0800 (Pacific Standard Time)</pubDate>
    </item>
    <item>
      <title>Big changes</title>
      <link>https://kojinglick.com/blog/changes</link>
<description>&lt;p&gt;Deno just released a &lt;a href=&quot;https://deno.com/blog/changes&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;fascinating blog article&lt;/a&gt; announcing new changes. There are two points that seemed extra interesting to me. First, it looks like there&#39;s going to be a new HTTP server in Deno. Whoo.&lt;/p&gt;
&lt;p&gt;More importantly, they&#39;ve announced interoperability with npm packages in the next three months. WITHOUT &lt;code&gt;npm install&lt;/code&gt; or /node_modules. Let&#39;s go!&lt;/p&gt;
&lt;p&gt;I get that most people want Deno without any interface with Node, but the reality is that for the sake of critical-mass adoption, Deno needs to be making all of the right decisions, even if it means accepting some old bad decisions. Mitigating the spillover from Node&#39;s ecosystem will look exactly like what they&#39;ve promised: a /node_modules-free life.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;There will be no node_modules folder, no npm install; 
the packages will be automatically downloaded in the Deno cache. &lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://deno.com/blog/changes&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Deno&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/blockquote&gt;
&lt;p&gt;Still, it&#39;s not super clear from this whether we&#39;ll need a package.json. I don&#39;t see why we would, if the packages are cached.&lt;/p&gt;
</description>
      <pubDate>Sun Aug 21 2022 22:38:04 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
    <item>
      <title>A Study in Threejs, Fresh and Product Pages</title>
      <link>https://kojinglick.com/blog/product-pages</link>
      <description>&lt;h2&gt;But why, though?&lt;/h2&gt;
&lt;p&gt;Because I can. Next question.&lt;/p&gt;
&lt;h2&gt;Let me see it.&lt;/h2&gt;
&lt;p&gt;Ok, here you go.&lt;/p&gt;
&lt;iframe id=&quot;cubeframe&quot;
    title=&quot;Product page for a Retro Console&quot;
    width=&quot;100%&quot;
    height=&quot;500px&quot;
    src=&quot;https://kojinglick.com/products/cube&quot;&gt;
&lt;/iframe&gt;
&lt;a href=&quot;https://kojinglick.com/products/cube&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;See it for yourself!&lt;/a&gt;

&lt;h2&gt;Elaborate.&lt;/h2&gt;
&lt;p&gt;Fine. Fresh&#39;s real sell is the ability to churn out solid web development that makes use of reactivity, while having more of the DX of making static HTML websites. It&#39;s raw, it&#39;s unstructured, it&#39;s fast. My obvious first instinct is to see how far I can push the solid web development stuff while having a development experience like throwing pasta at a wall and seeing what sticks. A great stress test is the upper end of client asks for static product pages: 3D graphics!&lt;/p&gt;
&lt;h2&gt;Tutorial?&lt;/h2&gt;
&lt;p&gt;Coming soon.&lt;/p&gt;
</description>
      <pubDate>Sun Aug 07 2022 20:49:32 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
    <item>
      <title>Jammin' Out With Xerf</title>
      <link>https://kojinglick.com/blog/jammin</link>
<description>&lt;p&gt;I just need to express my love and appreciation for &lt;a href=&quot;https://www.youtube.com/c/TheXMusicArchive&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Xearching for Sounds&lt;/a&gt;, an amazing music YouTube channel.&lt;/p&gt;
&lt;p&gt;A lot of the music is from the &#39;70s through the &#39;00s in Japan, and it has that jazzy, funky pop sound of the early internet era. There&#39;s something earnest in the music&#39;s glossy production and exuberant optimism. Countless hours have been spent jammin&#39; out to these tunes while coding. Some of the best programming music there is.&lt;/p&gt;
</description>
      <pubDate>Thu Aug 04 2022 22:32:32 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
    <item>
      <title>Mom's Art Needs a New Home</title>
      <link>https://kojinglick.com/blog/moms-art</link>
<description>&lt;p&gt;Mom&#39;s art needed a new home after Virb, the scammy build-a-site company, went kaput.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.miwakonishizawa.art/&quot;&gt;Miwako Nishizawa: Artist&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;I really like the SPA design. SPAs only really work if the content is essentially static and the purpose of the website is simple. The perfect storm for my mom&#39;s woodblock career. (Also got a scroll spy working!)&lt;/p&gt;
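&lt;p&gt;For the curious, a scroll spy mostly boils down to picking the last section whose top you&#39;ve scrolled past. A minimal sketch of that selection logic in TypeScript (the section data is hypothetical, and the DOM wiring is omitted):&lt;/p&gt;

```typescript
// A scroll spy reduces to: given section top offsets (px) in document
// order, pick the last section whose top the viewport has scrolled past.
interface Section {
  id: string;
  top: number; // the section element's offsetTop
}

function activeSection(sections: Section[], scrollY: number): string {
  let active = sections[0].id;
  for (const s of sections) {
    if (scrollY >= s.top) active = s.id;
  }
  return active;
}

// Hypothetical offsets for a single-page art site.
const sections = [
  { id: "bio", top: 0 },
  { id: "gallery", top: 800 },
  { id: "contact", top: 2400 },
];
console.log(activeSection(sections, 900)); // prints "gallery"
```

&lt;p&gt;In the real page, this runs inside a scroll event handler, with the offsets read from the DOM and a CSS class toggled on the matching nav link.&lt;/p&gt;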
</description>
      <pubDate>Thu Aug 04 2022 21:52:32 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
    <item>
      <title>Mapping Crypto: Part 2</title>
      <link>https://kojinglick.com/blog/mapping-crypto-ii</link>
      <description>&lt;h2&gt;Something a little better&lt;/h2&gt;
&lt;p&gt;Good news, everyone! (Futurama, anyone?) &lt;a href=&quot;https://opencryptomap.deno.dev/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Crypto-Map: Ethereum&lt;/a&gt; is now live!&lt;/p&gt;
&lt;iframe id=&quot;opencryptomapframe&quot;
    class=&quot;dark:text-green-400&quot;
    title=&quot;Website shows a lattice of interactions on the latest Ethereum block&quot;
    width=&quot;100%&quot;
    height=&quot;500px&quot;
    src=&quot;https://opencryptomap.deno.dev/15292713&quot;&gt;
&lt;/iframe&gt;

&lt;p&gt;Initial features include the ability to hover over a node to see the wallet address, and over an edge to see the amount sent. You can click and drag the network apart, and it&#39;s kind of fun!&lt;/p&gt;
&lt;p&gt;In order to get relatively live data on the blockchain, I&#39;m using &lt;a href=&quot;https://www.alchemy.com/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Alchemy&lt;/a&gt;&#39;s JSON-RPC enabled web-socket service.&lt;/p&gt;
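&lt;p&gt;The request itself is plain JSON-RPC, whether it travels over the socket or plain HTTPS. Here&#39;s a sketch of building the payload for fetching a block with full transaction objects; the method is standard Ethereum JSON-RPC, and the endpoint and fetch wiring (which Alchemy provides) are left out:&lt;/p&gt;

```typescript
// Build the standard Ethereum JSON-RPC payload for fetching one block,
// with full transaction objects (the `true` flag).
function blockRequest(blockNumber: number): string {
  return JSON.stringify({
    jsonrpc: "2.0",
    id: 1,
    method: "eth_getBlockByNumber",
    params: ["0x" + blockNumber.toString(16), true],
  });
}

// e.g. fetch(ALCHEMY_URL, { method: "POST", body: blockRequest(15292713) })
console.log(blockRequest(15292713));
```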
&lt;p&gt;Also, I gave up on using Rust for the initial build. If this gains traction and I need 1,000,000 concurrent users, I&#39;ll readdress the languages question. Or just learn &lt;a href=&quot;https://elixir-lang.org/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Elixir&lt;/a&gt;. Good grief.&lt;/p&gt;
&lt;h2&gt;Next Steps&lt;/h2&gt;
&lt;h3&gt;Making the UI better&lt;/h3&gt;
&lt;p&gt;It&#39;s nice to use d3, but under the hood, everything is spaghetti code and a weird transpilation chain of vanilla JS &gt; Node.js &gt; TypeScript &gt; Deno, so I need to make it a little more coherent.&lt;/p&gt;
&lt;h3&gt;Not just the current block&lt;/h3&gt;
&lt;p&gt;Best case scenario, we constantly update a back-end database when a block is finalized. This way, users of the map could access historical data, as well as explore the current block.&lt;/p&gt;
&lt;h3&gt;Actual realtime, not REST-lite&lt;/h3&gt;
&lt;p&gt;I may have said &quot;web-socket&quot; earlier, but I&#39;m essentially using it like a RESTful API service, where I just ask the Alchemy endpoint for data on page load. Next, I want a system that subscribes to real-time additions to the block, and then animates the SVG as it populates. Pie-in-the-sky stuff here.&lt;/p&gt;
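&lt;p&gt;Concretely, going real-time would mean sending a subscription message over the open socket instead of asking once on page load. A hedged sketch of the standard &lt;code&gt;eth_subscribe&lt;/code&gt; payload for new block headers (the actual WebSocket wiring is omitted):&lt;/p&gt;

```typescript
// The standard eth_subscribe request for new block headers. Sent once
// over an open WebSocket, the node then pushes a notification per block.
function subscribeNewHeads(id = 1): string {
  return JSON.stringify({
    jsonrpc: "2.0",
    id,
    method: "eth_subscribe",
    params: ["newHeads"],
  });
}

// ws.send(subscribeNewHeads()); then handle incoming "eth_subscription"
// messages and animate the SVG as new data arrives.
console.log(subscribeNewHeads());
```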
&lt;h2&gt;To bigger and better maps&lt;/h2&gt;
&lt;p&gt;I can&#39;t wait to bring this to next semester&#39;s De-Fi and Web3 class.&lt;/p&gt;
</description>
      <pubDate>Tue Aug 02 2022 19:23:32 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
    <item>
      <title>Mapping Crypto: Part 1</title>
      <link>https://kojinglick.com/blog/mapping-crypto-i</link>
      <description>&lt;h2&gt;A Long Time Coming&lt;/h2&gt;
&lt;p&gt;I&#39;ve been poking at this project for a couple of months now. It started with a study of the &lt;a href=&quot;https://docs.rs/petgraph/latest/petgraph/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;petgraph&lt;/a&gt; crate to try to build a backend storage/graph builder service. A month or so later, I learned about &lt;a href=&quot;https://developers.cloudflare.com/web3/ethereum-gateway/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Cloudflare&#39;s Ethereum Gateway&lt;/a&gt;. Since I have a couple of websites already using Cloudflare anti-DDoS services, I just attached a gateway to one of my domains. Finally, I got off my butt today to try to make at least &lt;em&gt;something&lt;/em&gt; that works.&lt;/p&gt;
&lt;h2&gt;My Embarrassing Tangle of Tech&lt;/h2&gt;
&lt;h3&gt;Oak: a server engine for Deno&lt;/h3&gt;
&lt;p&gt;I&#39;m. Learning. Deno. I don&#39;t know why I&#39;m now insisting I do everything in Deno and TypeScript, but so be it. I quickly ran through the Oak docs, and built a REST API relay for the Ethereum gateway. This is an entirely unnecessary step which could later be replaced by something in &lt;a href=&quot;https://actix.rs/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Actix Web&lt;/a&gt;. Right now, it gets the current block number, retrieves the transactions from the block, and serializes the information into JSON for Rust.&lt;/p&gt;
&lt;h3&gt;Petgraph Rust: building the graph&lt;/h3&gt;
&lt;p&gt;My main.rs is very simple. One function reads the json and deserializes it into Rust structs. Another turns those Rust structs into a node-based graph.&lt;/p&gt;
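&lt;p&gt;The shape of that graph-building step, transposed to TypeScript for illustration (the real service uses petgraph in Rust, and the field names here are simplified):&lt;/p&gt;

```typescript
// Wallets become nodes, transactions become directed, value-weighted
// edges -- the same shape the Rust service builds with petgraph.
interface Tx {
  from: string;
  to: string;
  value: number; // simplified; real values are 256-bit wei amounts
}

interface Graph {
  nodes: Set<string>;
  edges: { from: string; to: string; value: number }[];
}

function buildGraph(txs: Tx[]): Graph {
  const nodes = new Set<string>();
  const edges: Graph["edges"] = [];
  for (const tx of txs) {
    nodes.add(tx.from);
    nodes.add(tx.to);
    edges.push({ from: tx.from, to: tx.to, value: tx.value });
  }
  return { nodes, edges };
}

const g = buildGraph([
  { from: "0xaaa", to: "0xbbb", value: 1 },
  { from: "0xbbb", to: "0xccc", value: 2 },
]);
console.log(g.nodes.size, g.edges.length); // prints "3 2"
```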
&lt;h3&gt;Graphviz Dot Utility: gui is just svg&lt;/h3&gt;
&lt;p&gt;The laziest step is using Graphviz&#39;s &lt;a href=&quot;https://graphviz.org/doc/info/command.html&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Dot&lt;/a&gt; Utility to generate the graph.&lt;/p&gt;
&lt;p&gt;This is what it looks like.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/static/img/crypto_map.png&quot; alt=&quot;A map of wallet addresses&quot;&gt;&lt;/p&gt;
&lt;p&gt;But, it exists. Next steps are replacing the Oak demo with a Rust service, and then using Typescript and Fresh or React for the front-end.&lt;/p&gt;
</description>
      <pubDate>Mon Aug 01 2022 19:23:32 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
    <item>
      <title>Into Desolation Wilderness</title>
      <link>https://kojinglick.com/blog/into-desolation-wilderness</link>
      <description>&lt;h2&gt;July 29th&lt;/h2&gt;
&lt;p&gt;Meet Toby, then Stuart and Chloe in Berkeley, and then start driving towards Desolation around 18:30.&lt;/p&gt;
&lt;p&gt;Stop on the way to meet Ari and Mari, get (far too many) groceries and eat In-N-Out.&lt;/p&gt;
&lt;p&gt;Arrive at the Tell&#39;s Creek Trailhead/Equestrian Camp at 23:05.&lt;/p&gt;
&lt;p&gt;Enjoy some late night gaffs, and then to bed around 1:30.&lt;/p&gt;
&lt;pre class=&quot;plaintext&quot;&gt;&lt;code&gt;Protip: Make sure to be aware of nearby horse droppings
when camping at the trailhead parking lot.&lt;/code&gt;&lt;/pre&gt;

&lt;h2&gt;July 30th&lt;/h2&gt;
&lt;p&gt;Wake up at 8:00.&lt;/p&gt;
&lt;p&gt;Coffee, breakfast and packing before 10:30, hit the trail at 10:45.&lt;/p&gt;
&lt;p&gt;The plan is to go from Tell&#39;s Creek Equestrian Camp to Lake No. 3.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/static/img/0722desolation1.jpeg&quot; alt=&quot;Hikers on a trail walk by a meadow full of wildflowers.&quot;&gt;&lt;/p&gt;
&lt;p&gt;Leaving the trailhead, the path was mostly flat and gravelled, accessible to vehicles. We passed a large meadow to our left, across which we could see Van Vleck Bunkhouse. We continued until we reached our first major fork towards the right, up Red Peak Trail. The path continued to get narrower as it turned into the forest, occasionally poking out into lush meadows. By the second mile, the tire ruts in the path were gone, and the path was only wide enough for non-motorized traffic. Just before the second third of the hike, the path meanders through thick evergreen thickets, some of which seem to be very young. Many of the saplings by the trail were shorter than we were.&lt;/p&gt;
&lt;p&gt;This continues through the second third, where the path follows the thicker forest that flanks both sides of the creek.&lt;/p&gt;
&lt;p&gt;The final push, which is both more exposed and steeper than the rest of the hike, is where the lion&#39;s share of the 1,400ft of elevation is gained. Thanks to a challenging creek crossing, we lose the trail, and start to trailblaze onto the exposed rock. We&#39;re generally aware of the heading of Lake No. 3, but since there&#39;s no clear path ahead, momentum stalls. After a couple of failed attempts at re-finding the trail, we decide to continue up the rock until we see trail. While trying to catch up to some folks, two simultaneous calf cramps humble my aggressive catch-up pace. (Thank you Ari and Toby for the electrolytes!)&lt;/p&gt;
&lt;pre class=&quot;plaintext&quot;&gt;&lt;code&gt;Protip: Electrolytes. Water. Potassium. Cramps are the worst.&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;Finally, we catch the trail as it pokes above the forest and follows a ridge around 200m in front of us. Since it&#39;s a little lower than we got, we trace the topography of the ridge we&#39;re on until the trail reaches our elevation. Trails allow you to put your head down and just pound out distance and elevation. We regain our momentum to push to Lake No. 3. Right at the end of the journey, the trail splits towards the Lake from the Red Peak Trail. If we were to continue, we would reach Lake No. 5. The sky was mostly clear throughout the hike, and the temperature hovered around the upper 70s to mid 80s. We arrive at our campsite around 15:30, amounting to about 4:45 on the trail. We quickly take dives into the lake, preparing for our afternoon there.&lt;/p&gt;
&lt;p&gt;Including us, there were 4 camps on Lake No. 3. For a camping spot by a lake at the end of July, that&#39;s really good. The guy next to us, Max, turns out to be a solo backpacker. He&#39;s disappointed that the fish are gone. Apparently, Desolation Wilderness is undergoing a frog habitat renewal, meaning the lakes aren&#39;t being stocked as usual.&lt;/p&gt;
&lt;p&gt;The lake sits on the Western slope of Red Peak, and the sunset slowly casts over the rest of Desolation Wilderness.&lt;/p&gt;
&lt;p&gt;Dinner eaten, bear canisters packed by 23:00.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/static/img/0722desolation2.jpeg&quot; alt=&quot;Evergreen woods bathing in the sunset&quot;&gt;&lt;/p&gt;
&lt;h2&gt;July 31st&lt;/h2&gt;
&lt;p&gt;Wake up to incessant buzzing at 7:00. I give up trying to go back to bed and I go to the bathroom, and catch Ari also awake. We take the bear canisters back to camp. Besides the ants in the trash, no animal interference occurred over the night. Toby&#39;s hammock calls to me, and I return to sleep until 9:30 to the sounds of others awake.&lt;/p&gt;
&lt;p&gt;Coffee, oatmeal, and dips into the lake are done by 11:30. We start hiking towards the trailhead at 11:45. The way down is easier, and we identify exactly the place we went off track. We realize the trail is camouflaged by a lattice of dry runoffs for spring snowmelt, which carry sediment down and mimic the gravelly floor of an actual trail.&lt;/p&gt;
&lt;p&gt;Right when it gets flat, Toby and I find this curious mushroom.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/static/img/0722desolation3.jpeg&quot; alt=&quot;Skull-shaped mushroom&quot;&gt;&lt;/p&gt;
&lt;p&gt;Chloe takes this candid photo of me on the way down.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/static/img/0722desolation4.jpeg&quot; alt=&quot;Me in forest&quot;&gt;&lt;/p&gt;
&lt;p&gt;Toby catches this amazing pull-focused shot.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/static/img/0722desolation5.jpeg&quot; alt=&quot;A wildflower is foregrounded in front of a party of hikers&quot;&gt;&lt;/p&gt;
&lt;p&gt;We reach the cars at around 15:00. Some lunch, repacking, goodbyes, and then back home. Stu, Chloe, Toby and I get to Berkeley for pizza. Delicious.&lt;/p&gt;
</description>
      <pubDate>Mon Aug 01 2022 15:57:32 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
    <item>
      <title>Using Github as a CMS with Fresh</title>
      <link>https://kojinglick.com/blog/using-github-as-cms</link>
      <description>&lt;h2&gt;Something Mind-Bogglingly Easy&lt;/h2&gt;
&lt;p&gt;When I build a blog for a client, I try to consider a couple of things beforehand:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Where will I store the content?&lt;/li&gt;
&lt;li&gt;How will I access the content?&lt;/li&gt;
&lt;li&gt;How will I create the content?&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;By considering these things, I can settle on a front-end/content management stack that is effective at completing the task at hand. For example, when building &lt;a href=&quot;https://privacycode.ai/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;PrivacyCode&lt;/a&gt;, I settled on using a server-side rendered &lt;a href=&quot;https://reactjs.org/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;React&lt;/a&gt; application with a &lt;a href=&quot;https://sanity.io/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Sanity&lt;/a&gt; content management system by answering those questions as follows:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;I need to store it somewhere persistent, secure, and offsite.&lt;/li&gt;
&lt;li&gt;I need a secure way of accessing the data, like GraphQL or a REST API.&lt;/li&gt;
&lt;li&gt;Content needs to be draftable by a content team.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;But for myself, these considerations are a little overkill. I&#39;m not sharing anything critical to &lt;a href=&quot;https://moonstripe.com/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;my business&lt;/a&gt;, and I don&#39;t have a content team to worry about. I&#39;m optimizing for convenience, not security or propriety. Besides, as I add complexity to this blog project, it becomes harder to actually keep to a regular posting pattern. I actually do want to use it to chronicle what&#39;s going on in my life. Let&#39;s go back to those questions:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;I can store my content somewhere easy and accessible.&lt;/li&gt;
&lt;li&gt;I can access the content through a filesystem, rather than deal with a REST API or GraphQL.&lt;/li&gt;
&lt;li&gt;I can create a Markdown file in that filesystem whenever I want.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;I&#39;ve dreamt of actually implementing something that looks decent with a skeleton this simple. Here&#39;s how I implemented it.&lt;/p&gt;
&lt;h2&gt;Fresh and Deno Deploy&lt;/h2&gt;
&lt;p&gt;I still haven&#39;t gotten to anything related to Github, but hopefully the groundwork I&#39;ve laid will come in handy. Given that I&#39;m starting with convenience, Fresh, a brand-new, Deno-based web framework, sounded like a great place to start. Not only is it basically a bare-bones, static version of React, it&#39;s also built in Deno, giving me access to Deno Deploy, a crazy awesome solution to hosting your content on the edge. I&#39;ve seen &lt;a href=&quot;https://t3-fresh-test.deno.dev/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;demos&lt;/a&gt; of how fast these Deno containers can be spun up to serve content, so I wanted to see it for myself.&lt;/p&gt;
&lt;h3&gt;Fresh&lt;/h3&gt;
&lt;p&gt;&lt;a href=&quot;https://fresh.deno.dev/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Fresh&lt;/a&gt; is based on &lt;a href=&quot;https://preactjs.com/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;preact&lt;/a&gt;, so it&#39;s essentially a bare-bones, SSR only, React-ish application. It has some quirks to make it so bare-bones, like its novel &lt;a href=&quot;https://fresh.deno.dev/docs/concepts/islands&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;islands&lt;/a&gt; system, which forces you to consider exactly where you want interactivity and the full suite of React-based tools.&lt;/p&gt;
&lt;h3&gt;Deno Deploy&lt;/h3&gt;
&lt;p&gt;&lt;a href=&quot;https://deno.com/deploy/docs&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Deno Deploy&lt;/a&gt; is a cool new edge deployment solution specifically for &lt;a href=&quot;https://deno.land&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Deno&lt;/a&gt; runtimes. They&#39;re super fast at this point, and allow you to use a free .deno.dev domain, or connect your own. Pricing is generous, and I don&#39;t anticipate breaching the freemium threshold.&lt;/p&gt;
&lt;h2&gt;Building the Blog&lt;/h2&gt;
&lt;p&gt;Without further ado, let&#39;s get developing. Make sure you&#39;ve &lt;a href=&quot;https://deno.land/#installation&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;installed the Deno runtime&lt;/a&gt;, then verify the installation with the following command.&lt;/p&gt;
&lt;pre class=&quot;bash&quot;&gt;&lt;code&gt;deno --version&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;This command returns useful information, like the Deno version, V8 engine version, and TypeScript version.&lt;/p&gt;
&lt;h3&gt;Initialize your Project&lt;/h3&gt;
&lt;p&gt;The first time you do this, briefly consider all of the ways you&#39;ve started React applications in the past. Maybe you&#39;re still new and you haven&#39;t found an alternative to Create-React-App yet. Maybe you&#39;re an absolute React wizard and you write your own babel and webpack configuration files. Regardless, think about all you did before seeing that spinning atom. &lt;/p&gt;
&lt;p&gt;All you need to do to start a new fresh project is the following script:&lt;/p&gt;
&lt;pre class=&quot;bash&quot;&gt;&lt;code&gt;deno run -A -r https://fresh.deno.dev my-blog&lt;/code&gt;&lt;/pre&gt;  

&lt;p&gt;This copies the &lt;a href=&quot;https://fresh-demo.deno.dev/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Fresh demo&lt;/a&gt; into a new directory called &quot;my-blog&quot;. During initialization, it will prompt you twice. First, it asks if you want to use &lt;a href=&quot;https://twind.dev&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;@twind&lt;/a&gt;, a CSS-in-JS version of &lt;a href=&quot;https://tailwindcss.com&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Tailwind CSS&lt;/a&gt;. Type &lt;code&gt;y&lt;/code&gt; to include it. Then, it asks if you use VS Code. Type &lt;code&gt;y&lt;/code&gt; to include editor support.&lt;/p&gt;
&lt;p&gt;Change directories into &quot;my-blog&quot; and start the demo:&lt;/p&gt;
&lt;pre class=&quot;bash&quot;&gt;&lt;code&gt;cd my-blog
deno task start&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Pause a moment. Take a deep breath. Center yourself. You haven&#39;t typed &lt;code&gt;npm&lt;/code&gt;, &lt;code&gt;npm init -y&lt;/code&gt;, or &lt;code&gt;npm install&lt;/code&gt;. That&#39;s the beauty of &lt;a href=&quot;https://deno.land&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Deno&lt;/a&gt;.&lt;/p&gt;
&lt;h3&gt;Fresh&#39;s File Structure&lt;/h3&gt;
&lt;p&gt;When you initialized the Fresh project, it set up the following file structure in the my-blog directory:&lt;/p&gt;
&lt;pre class=&quot;plaintext&quot;&gt;&lt;code&gt;my-blog
+-- /islands
|   +-- Counter.tsx &lt;- interactive counter
+-- /routes
|   +-- /api
|       +-- joke.tsx &lt;- example handler for fetching
|   +-- [name].tsx &lt;- loads pathname &quot;/[name]&quot;
|   +-- index.tsx &lt;- loads pathname &quot;/&quot;
+-- /static
|   +-- favicon.ico
|   +-- logo.svg
+-- /utils
|   +-- twind.ts &lt;- entrypoint for your styling
+-- deno.json &lt;- stores commands and import map
+-- dev.ts &lt;- set up development environment
+-- fresh.gen.ts &lt;- auto-generated map for components
+-- import_map.json &lt;- third-party Deno modules
+-- main.ts &lt;- entrypoint for your app
+-- README.md&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;For the sake of my blog project, I added two more folders: &quot;/components&quot; to store my non-interactive, layout components and &quot;/content&quot; to store my Markdown files. In the &quot;/content&quot; folder, add a new file called &quot;welcome.md&quot; and populate it with:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;content/welcome.md&lt;/p&gt;
&lt;/blockquote&gt;
&lt;pre class=&quot;markdown&quot;&gt;&lt;code&gt;# First Post

[The Current Time and Date]

Hello World!&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;I also removed the &quot;/routes/api&quot; folder and the &quot;/islands/Counter.tsx&quot; file, because they aren&#39;t essential to a blog, and those jokes are super corny and lame.&lt;/p&gt;
&lt;h3&gt;Building Out the Blog Index&lt;/h3&gt;
&lt;p&gt;If you&#39;re used to the flexibility of &lt;a href=&quot;https://reactrouter.com/&quot; target=&quot;_blank&quot; rel=&quot;noreferrer noopener&quot;&gt;React Router DOM&lt;/a&gt;, you won&#39;t get that here. Instead, Fresh&#39;s internal routing is a lot more like &lt;a href=&quot;https://nextjs.org/&quot; target=&quot;_blank&quot; rel=&quot;noreferrer noopener&quot;&gt;Next.js&lt;/a&gt;, where the file structure maps out your application&#39;s routes.&lt;/p&gt;
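&lt;p&gt;As a quick mental model of that file-based routing, here&#39;s a sketch (my own annotation of the file-to-URL mapping, not anything Fresh outputs):&lt;/p&gt;

```typescript
// My annotation of how files under /routes map to URL patterns.
const routeMap = {
  "routes/index.tsx": "/",       // the home page
  "routes/[name].tsx": "/:name", // dynamic segment, e.g. "/welcome"
};

console.log(routeMap["routes/index.tsx"]); // "/"
```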
&lt;p&gt;Since the goal is simplicity, we&#39;re going to list the blog contents on the home page. No need to get complicated with the routing, but I encourage you to add an &quot;about&quot; page and make the home page as pretty as you can.&lt;/p&gt;
&lt;p&gt;In &quot;/routes/index.tsx&quot;, we need to set up the preact component like this:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;routes/index.tsx&lt;/p&gt;
&lt;/blockquote&gt;
&lt;pre&gt;&lt;code&gt;/** @jsx h */
import { h, Fragment } from &quot;preact&quot;;
import { tw } from &quot;@twind&quot;;
import { Handlers, PageProps } from &quot;fresh/server.ts&quot;;
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;The first two lines set up Preact&#39;s rendering function, &lt;code&gt;h&lt;/code&gt;, and give us access to the &lt;code&gt;Fragment&lt;/code&gt; component. We won&#39;t worry too much about that here, but feel free to read &lt;a href=&quot;https://jasonformat.com/wtf-is-jsx/&quot; target=&quot;_blank&quot; rel=&quot;noreferrer noopener&quot;&gt;more&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The third line sets up our CSS-in-JS solution, @twind.&lt;/p&gt;
&lt;p&gt;Finally, we import the Handlers and PageProps types from &quot;fresh/server.ts&quot;.&lt;/p&gt;
&lt;p&gt;Next, let&#39;s build the handler that populates our page with our blog content. In &quot;my-blog/routes/index.tsx&quot;, add the following:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;routes/index.tsx&lt;/p&gt;
&lt;/blockquote&gt;
&lt;pre&gt;&lt;code&gt;interface Post {
    slug: string,
    date: string,
    title: string
}

export const handler: Handlers = {
    async GET(_req, ctx) {
        const blogArticles: Post[] = [];

        // Read every file in /content; the first line of each file is
        // its title, and the third line is its date.
        for await (const item of Deno.readDir(&#39;content/&#39;)) {
            if (item.isFile) {
                const path = `content/${item.name}`;
                const file = await Deno.readTextFile(path);
                const titleString = file.split(&quot;\n&quot;)[0];
                const dateString = file.split(&quot;\n&quot;)[2];

                blogArticles.push({
                    slug: item.name,
                    date: dateString,
                    title: titleString
                });
            }
        }

        return ctx.render({ blogArticles });
    },
};
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;The &lt;code&gt;Post&lt;/code&gt; interface describes the metadata we care about for each blog post.&lt;/p&gt;
&lt;p&gt;The exported GET handler creates an array of Posts called &lt;code&gt;blogArticles&lt;/code&gt; and populates it with the slug (the pathname), date, and title of each post in &quot;/content&quot;. Finally, we pass the array to the &lt;code&gt;render&lt;/code&gt; method of the handler&#39;s second argument, &lt;code&gt;ctx&lt;/code&gt;. This makes &lt;code&gt;blogArticles&lt;/code&gt; available to our component.&lt;/p&gt;
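&lt;p&gt;To see the line-splitting in isolation: assuming a file laid out like our &quot;welcome.md&quot; (title on the first line, date on the third), a hypothetical helper of mine, &lt;code&gt;parsePostMeta&lt;/code&gt;, captures what the loop does per file:&lt;/p&gt;

```typescript
// Hypothetical helper mirroring the handler's per-file logic:
// line 0 holds the Markdown title, line 2 holds the date.
function parsePostMeta(file: string): { title: string; date: string } {
  const lines = file.split("\n");
  return { title: lines[0], date: lines[2] };
}

const meta = parsePostMeta("# First Post\n\n[The Current Time and Date]\n\nHello World!");
console.log(meta.title); // "# First Post"
console.log(meta.date);  // "[The Current Time and Date]"
```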
&lt;p&gt;Finally, we&#39;ll add the component itself:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;routes/index.tsx&lt;/p&gt;
&lt;/blockquote&gt;
&lt;pre&gt;&lt;code&gt;export default ({ data }: PageProps) =&gt; {
    return (
        &lt;Fragment&gt;
            &lt;h1&gt;Thoughts&lt;/h1&gt;
            {
                data.blogArticles.map((e: Post) =&gt; (
                    &lt;div&gt;
                        &lt;a href={`/${e.slug.split(&#39;.&#39;)[0]}`}&gt;
                            &lt;h1&gt;{e.title.slice(2)}&lt;/h1&gt;
                            &lt;p&gt;{e.date}&lt;/p&gt;
                        &lt;/a&gt;
                    &lt;/div&gt;
                ))
            }
        &lt;/Fragment&gt;
    );
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;When you pass information from your handler to your component, the data is stored in the &lt;code&gt;data&lt;/code&gt; property of your component&#39;s &lt;code&gt;props&lt;/code&gt;. Notice that we destructure &lt;code&gt;data&lt;/code&gt; from the props of the default-exported component function to use in the render loop.&lt;/p&gt;
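&lt;p&gt;The two small string transforms in the render loop are worth seeing on their own:&lt;/p&gt;

```typescript
// slug: drop the ".md" extension to build the link target
const slug = "welcome.md".split(".")[0];
console.log(`/${slug}`); // "/welcome"

// title: drop the leading "# " from the Markdown heading
const title = "# First Post".slice(2);
console.log(title); // "First Post"
```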
&lt;p&gt;When you run the script:&lt;/p&gt;
&lt;pre class=&quot;bash&quot;&gt;&lt;code&gt;deno task start&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;You should see a white page with your &quot;Thoughts&quot; header and information about your first &quot;/content/welcome.md&quot; post. If you click on the text, you will be taken to &quot;localhost:8000/welcome&quot;, where you&#39;ll be greeted with a &quot;Hello welcome&quot;.&lt;/p&gt;
&lt;p&gt;Let&#39;s build that content page.&lt;/p&gt;
&lt;h3&gt;Making the Post Island&lt;/h3&gt;
&lt;p&gt;In order to build our blog routes, we need to take a brief detour into the world of islands.&lt;/p&gt;
&lt;p&gt;In Fresh, interactivity like programmatically rendering inner HTML requires a little bit of running around. Only components defined in &quot;/islands&quot; are hydrated on the client. But here&#39;s the catch: islands can&#39;t receive complex props. They can only receive serializable values like strings, numbers, and plain arrays and objects.&lt;/p&gt;
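&lt;p&gt;A rough litmus test I use (my own rule of thumb, not Fresh&#39;s actual check): if a value survives a JSON round trip, it&#39;s safe to pass to an island.&lt;/p&gt;

```typescript
// Plain data survives a JSON round trip, so it can cross into an island.
const ok = { title: "First Post", tags: ["deno", "fresh"] };
const roundTripped = JSON.parse(JSON.stringify(ok));
console.log(roundTripped.tags[0]); // "deno"

// A function prop would not survive: JSON.stringify silently drops it.
const bad = { onClick: () => {} };
console.log(JSON.stringify(bad)); // "{}"
```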
&lt;p&gt;Let&#39;s build our first island called Post by creating the file &quot;/islands/Post.tsx&quot;.&lt;/p&gt;
&lt;p&gt;Islands are started like any other component, but now we can import hooks:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;islands/Post.tsx&lt;/p&gt;
&lt;/blockquote&gt;
&lt;pre&gt;&lt;code&gt;/** @jsx h */
import { h, Fragment } from &quot;preact&quot;;
import { tw } from &quot;@twind&quot;;
import { useRef, useLayoutEffect } from &quot;preact/hooks&quot;;
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;This component will take in the parsed Markdown as an HTML string (here called &quot;markup&quot;) and output an article.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;islands/Post.tsx&lt;/p&gt;
&lt;/blockquote&gt;
&lt;pre&gt;&lt;code&gt;export default function Post({ markup }: { markup: string }) {
    const el = useRef&lt;HTMLDivElement&gt;(null);

    // After the first client-side render, inject the parsed
    // Markdown into the article element.
    useLayoutEffect(() =&gt; {
        if (el.current) {
            el.current.innerHTML = markup;
        }
    }, []);

    return (
        &lt;Fragment&gt;
            &lt;article ref={el} /&gt;
        &lt;/Fragment&gt;
    );
}
&lt;/code&gt;&lt;/pre&gt;
&lt;h3&gt;Building Out the Blog Routes&lt;/h3&gt;
&lt;p&gt;Making an individual blog article page is a lot like building the blog index, but this time we&#39;re also importing a Markdown parser. We make it available by adding a &quot;markdown&quot; entry to &quot;import_map.json&quot;:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;import_map.json&lt;/p&gt;
&lt;/blockquote&gt;
&lt;pre&gt;&lt;code&gt;{
    &quot;imports&quot;: {
        &quot;fresh/&quot;: &quot;https://deno.land/x/fresh@1.0.1/&quot;,
        &quot;preact&quot;: &quot;https://esm.sh/preact@10.8.2&quot;,
        &quot;preact/&quot;: &quot;https://esm.sh/preact@10.8.2/&quot;,
        &quot;preact-render-to-string&quot;: &quot;https://esm.sh/preact-render-to-string@5.2.0?deps=preact@10.8.2&quot;,
        &quot;@twind&quot;: &quot;./utils/twind.ts&quot;,
        &quot;twind&quot;: &quot;https://esm.sh/twind@0.16.17&quot;,
        &quot;twind/&quot;: &quot;https://esm.sh/twind@0.16.17/&quot;,
        &quot;markdown&quot;: &quot;https://deno.land/x/markdown@v2.0.0/mod.ts&quot;
    }
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;To view the content of the Markdown article from your new Post island, we need to alter &quot;/routes/[name].tsx&quot;. Initialize the file as we did before:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;routes/[name].tsx&lt;/p&gt;
&lt;/blockquote&gt;
&lt;pre&gt;&lt;code&gt;/** @jsx h */ 
import { h, Fragment } from &quot;preact&quot;;
import { Handlers, PageProps } from &quot;fresh/server.ts&quot;;
import { Marked } from &quot;markdown&quot;;
import Post from &#39;../islands/Post.tsx&#39;
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Next, we&#39;re going to add another handler, but instead of pulling metadata from our &quot;/content&quot; folder, we&#39;ll be passing the parsed Markdown content to our component in the &lt;code&gt;markup&lt;/code&gt; property. You can choose to make another type interface for the content, but since I&#39;m not looping over anything, it&#39;s trivial to remember that the markup content is stored in &lt;code&gt;props.data.markup&lt;/code&gt;.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;routes/[name].tsx&lt;/p&gt;
&lt;/blockquote&gt;
&lt;pre&gt;&lt;code&gt;export const handler: Handlers = {
    async GET(req, ctx) {
        // e.g. a request for &quot;/welcome&quot; yields the file name &quot;welcome&quot;
        const url = new URL(req.url).pathname.split(&#39;/&#39;);
        const file = url[1];

        const decoder = new TextDecoder(&quot;utf-8&quot;);
        const markdown = decoder.decode(await Deno.readFile(`./content/${file}.md`));
        const markup = Marked.parse(markdown);

        return ctx.render({ markup: markup.content });
    },
};
&lt;/code&gt;&lt;/pre&gt;
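&lt;p&gt;The pathname-to-filename step above boils down to a couple of lines of plain string handling:&lt;/p&gt;

```typescript
// A stand-in for req.url inside the handler.
const requestUrl = "http://localhost:8000/welcome";

// Splitting the pathname on "/" yields ["", "welcome"],
// so index 1 is the file name to read from /content.
const file = new URL(requestUrl).pathname.split("/")[1];
console.log(file); // "welcome" -> read ./content/welcome.md
```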
&lt;p&gt;Finally, let&#39;s render the post in this component.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;routes/[name].tsx&lt;/p&gt;
&lt;/blockquote&gt;
&lt;pre&gt;&lt;code&gt;export default ({ data }: PageProps) =&gt; {
    return (
        &lt;Fragment&gt;
            &lt;a href=&#39;/&#39;&gt;back to home&lt;/a&gt;
            &lt;Post markup={data.markup}/&gt;
        &lt;/Fragment&gt;
    );
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Now, when we click on the first post in our blog index, it&#39;ll display a page with the post content! Very nice.&lt;/p&gt;
&lt;h3&gt;Deploying to Deno Deploy&lt;/h3&gt;
&lt;p&gt;There are some things we need to do to prepare our deployment.&lt;/p&gt;
&lt;p&gt;First, we need to alter our &quot;import_map.json&quot; so the Fresh import is mapped as &quot;fresh/&quot; rather than the default &quot;$fresh/&quot;. Make sure you also remove the &quot;$&quot; prefix from each Fresh import across your app.&lt;/p&gt;
&lt;p&gt;Next, let&#39;s change &quot;deno.json&quot;. Right now, the &lt;code&gt;start&lt;/code&gt; task runs the development environment. Rename it to &lt;code&gt;start:dev&lt;/code&gt;, then add a new &lt;code&gt;start&lt;/code&gt; task, so your &quot;deno.json&quot; looks like this:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;deno.json&lt;/p&gt;
&lt;/blockquote&gt;
&lt;pre&gt;&lt;code&gt;{
    &quot;tasks&quot;: {
        &quot;start:dev&quot;: &quot;deno run -A --watch=static/,routes/ dev.ts&quot;,
        &quot;start&quot;: &quot;deno run -A --watch=static/,routes/ main.ts&quot;
    },
    &quot;importMap&quot;: &quot;import_map.json&quot;
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Finally, create an account on &lt;a href=&quot;https://deno.com/deploy/docs&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Deno Deploy&lt;/a&gt; by clicking the &quot;Sign In&quot; link. Deploying is as easy as connecting your GitHub account, selecting the repository that has your blog, and clicking &quot;Link&quot;. Read more &lt;a href=&quot;https://deno.com/deploy/docs&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;In a few seconds, you should be able to see your new blog, hosted on &quot;[slug].deno.dev&quot;.&lt;/p&gt;
&lt;h2&gt;Conclusion&lt;/h2&gt;
&lt;p&gt;Hopefully, your mind is as blown as mine was when I cranked this project out in 4 hours. Fresh and Deno provide a very streamlined DX that is hard to beat, especially given what&#39;s out there in the space currently.&lt;/p&gt;
&lt;p&gt;You can find the source code for the project &lt;a href=&quot;https://github.com/moonstripe/demo-github-cms&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;here&lt;/a&gt; and a live demo of the blog &lt;a href=&quot;https://simple-blog.deno.dev/welcome&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;here&lt;/a&gt;.&lt;/p&gt;
</description>
      <pubDate>Fri Jul 29 2022 14:28:32 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
    <item>
      <title>Syntax Highlighting</title>
      <link>https://kojinglick.com/blog/syntax-highlighting-test</link>
      <description>&lt;p&gt;Hopefully the below code renders as Typescript:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;let order: number = 0;
order++;
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;And the below code renders as Rust:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;use std::net;

fn main() {
    println!(&quot;Hello World&quot;);
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Finally, this code should render as raw HTML:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;&lt;!DOCTYPE html&gt;
&lt;html&gt;
    &lt;head&gt;
        &lt;title&gt;Title of the document&lt;/title&gt;
    &lt;/head&gt;

    &lt;body&gt;
        &lt;p&gt;
            The content of the document......
        &lt;/p&gt;
        
     &lt;/body&gt;
 &lt;/html&gt;
 
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Time for tutorials.&lt;/p&gt;
</description>
      <pubDate>Fri Jul 29 2022 10:45:32 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
    <item>
      <title>Welcome</title>
      <link>https://kojinglick.com/blog/welcome</link>
      <description>&lt;p&gt;To everyone and no one, hello. &lt;/p&gt;
&lt;p&gt;On the other side of viral success, view counts in the millions, and the other isolated clusters of immense internet traffic, most of the internet lies empty. I’ve started a handful of online presences over my life; very few garnered even the attention of my in-person friends and family. I still wonder what all of those abandoned blogs are doing. They’re probably trafficking pornography or other scams by now.&lt;/p&gt;
&lt;p&gt;Perhaps this too will go the way of my adolescent blogs and course-mandated tweeting. But each time I start one of these, there’s a sense of wonder. Like walking into a desert, I lay feeble claim to a little corner of a vast, unknowable expanse.&lt;/p&gt;
</description>
      <pubDate>Wed Jul 27 2022 23:57:32 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
    <item>
      <title>On the Development and Deployment of Cyberwarfare II</title>
      <link>https://kojinglick.com/blog/on-the-development-and-deployment-of-cyberwarefare-II</link>
      <description>&lt;p&gt;On April 2nd, one of the world&#39;s largest wind turbine manufacturers, Nordex, was hit by a cyberattack. The German company&#39;s response took down most of its critical infrastructure down, &lt;a href=&quot;https://www.securityweek.com/wind-turbine-giant-nordex-shuts-down-it-systems-response-cyberattack&quot;&gt;in what some cybersecurity professionals identify as a telltale sign of ransomware&lt;/a&gt;. Although neither the company nor national agencies attribute the attack to any actor in particular, geopolitical indicators point to a connection to the Russian invasion of Ukraine. Regardless of who might be behind the attack, this attack highlights the unique strategic role that cybersecurity will play moving forward.&lt;/p&gt;
&lt;p&gt;One of the most pressing implications of the Ukraine-Russia conflict is the future of European energy security. To keep this description brief, Europeans rely disproportionately on cheap Russian natural gas, and the continued domestic pressures to move away from producing alternative sources of energy like nuclear or natural gas regionally have only made things worse. Russia is therefore in a unique position to deny Europe an essential ingredient to Europe&#39;s standard of living.&lt;/p&gt;
&lt;p&gt;In other words, Moscow is a strategic beneficiary of the recent harm caused by the Nordex intrusion. Without waiting for attribution or the unlikely event of a digital trail to the attackers, strategists in countries across the world should be alarmed by this event. Cyber-war is moving out of its fledgling phase, where it was relegated to a tactical tool, into geopolitical maturity as a strategic threat.&lt;/p&gt;
&lt;p&gt;Lucas Kello, in his essay discussing the unique strategic space that cyber-war creates for itself, notes how cyber-warfare is offense-dominant. He argues that the costs of defending against an attacker are much higher than the costs of attacking an adversary. While I won&#39;t delve too much into his work, he is successful in his argument while addressing counter-points effectively. &lt;/p&gt;
&lt;p&gt;What this means for the geopolitical landscape is significant. Not only can a relatively weak Russia rely on its strategic nuclear weapons in times of catastrophe; attacks like the one on Nordex can also offset the slow deterioration of Russia&#39;s conventional posturing. Russia now has effective strategic tools below the threshold of nuclear war, enabling it to use a substantially cheaper cyber-offense to wrest concessions out of all but the world&#39;s most cyber-capable and wealthy entities, both public and private. By using cyber-war to jeopardize adversarial energy independence, Russia can alter the conditions of a ceasefire, armistice, or peace treaty by way of the conflict&#39;s mediators. The ability of asymmetrical conflict to exist outside of ostensible war is significant for Russia, as well as for other actors who have faced international condemnation. When the lights go out, who is to blame?&lt;/p&gt;
</description>
      <pubDate>Thu Apr 07 2022 14:52:00 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
    <item>
      <title>On the Development and Deployment of Cyberwarfare I</title>
      <link>https://kojinglick.com/blog/on-the-development-and-deployment-of-cyberwarefare-I</link>
      <description>&lt;p&gt;I don&#39;t think it&#39;s even necessary at this point to highlight the atrocity of the developing conflict in Ukraine. &quot;War bad&quot; is no longer the hot take it used to be, especially when one side of the conflict is both nuclear-capable and internationally cornered. &lt;a href=&quot;https://time.com/6152076/putin-nuclear-alert-ukraine/&quot;&gt;But as terrifying as the news of Putin readying nuclear forces is&lt;/a&gt;, I am much more concerned with what happened hours before &lt;a href=&quot;https://www.reuters.com/world/europe/russias-putin-authorises-military-operations-donbass-domestic-media-2022-02-24/&quot;&gt;he declared its &quot;special military operation&quot;&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Discovered by ESET researchers on February 23rd, mere hours before the start of the latest round of hostilities, Hermetic Wiper is malware designed to break into digital storage hardware and simply delete everything there.&lt;/p&gt;
&lt;p&gt;&quot;Breaking. &lt;a href=&quot;https://twitter.com/hashtag/ESETResearch?src=hashtag_click&quot;&gt;#ESETResearch&lt;/a&gt; discovered a new data wiper malware used in Ukraine today. ESET telemetry shows that it was installed on hundreds of machines in the country. This follows the DDoS attacks against several Ukrainian websites earlier today 1/n&quot;&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://twitter.com/ESETresearch&quot;&gt;ESETresearch&lt;/a&gt;, &lt;a href=&quot;https://twitter.com/ESETresearch/status/1496581903205511181&quot;&gt;Feb 23&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Another Twitter user by the name of &lt;a href=&quot;https://twitter.com/fr0gger_&quot;&gt;Thomas Roccia&lt;/a&gt; gave a great summary of how Hermetic Wiper works in the following days:&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://cdn.sanity.io/images/s7hkd94g/production/487ad828a39a3d820fb263d772eb8ec82128ede3-1720x1206.png?w=2000&amp;h=2000&amp;fit=max&quot; alt=&quot;The main impacts of Hermetic Wiper are partition and disk corruption, byte overwriting, and anti-forensics&quot;&gt;&lt;/p&gt;
&lt;p&gt;It&#39;s interesting to track the sudden appearance of a wiper during what can only be described as the year of ransomware. Essentially, the difference between a ransomware attack and a wiper is recoverability: ransomware requires hackers to be able to exchange their pilfered data for compensation (typically Bitcoin or another cryptocurrency). Wipers like the one discovered in Ukraine, on the other hand, show no regard for the value of that data. Encrypted data can be recovered; a corrupted hard drive is much harder to ransom. As Roccia notes, one of the benefits of this sabotage is anti-forensic: in other words, it covers the attackers&#39; tracks.&lt;/p&gt;
&lt;p&gt;This makes sense in the context of Putin&#39;s escalation of the conflict. Rather than &lt;a href=&quot;https://www.nytimes.com/2022/02/23/business/russia-sanctions-cryptocurrency.html&quot;&gt;engaging Russia&#39;s digital capabilities in fundraising&lt;/a&gt; as in recent years, Putin is now gearing the growing cyberwar towards industrial and financial sabotage. We can expect these cyberweapons to spill over into neighboring countries and perhaps across the world. Given the influx of major cybersecurity incidents in the last week, it&#39;s likely that variants and strains of this malware will be deployed in the same way as in Ukraine: as a clandestine first strike against possible combatants.&lt;/p&gt;
</description>
      <pubDate>Tue Mar 01 2022 13:30:00 GMT-0800 (Pacific Standard Time)</pubDate>
    </item>
    <item>
      <title>Learning GraphQL and Apollo</title>
      <link>https://kojinglick.com/blog/learning-graphql-and-apollo</link>
      <description>&lt;p&gt;Today was a good day.&lt;/p&gt;
&lt;p&gt;It took me approximately 3 or 4 months to become fully comfortable with the traditional RESTful API and database experience with both SQL and MongoDB, and that was with an instructor paid to help us understand. &lt;/p&gt;
&lt;p&gt;Today, it only took me a couple hours of development time to get my &lt;a href=&quot;https://www.sanity.io/&quot;&gt;sanity.io&lt;/a&gt; content working with this scratch built &lt;a href=&quot;https://reactjs.org/&quot;&gt;React.js&lt;/a&gt; blog using &lt;a href=&quot;https://graphql.org/&quot;&gt;GraphQL&lt;/a&gt; and &lt;a href=&quot;https://www.apollographql.com/&quot;&gt;Apollo&lt;/a&gt;. &lt;/p&gt;
&lt;p&gt;&lt;em&gt;2023 Edit: the current website you&#39;re looking at uses an entirely different stack. See &lt;a href=&quot;https://www.kojinglick.com/using-github-as-cms&quot;&gt;this post&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Of course I&#39;m not remotely a pro at using GraphQL, but I&#39;m starting to see why it&#39;s appearing in more and more tech stacks for web developers. I&#39;ll outline what I&#39;m seeing at this stage, more to see in a few months how much of an absolute newbie I&#39;m being at this moment. &lt;/p&gt;
&lt;p&gt;Compared to RESTful APIs, where most of the logic happens in controllers, middleware, and the arcane realm of the back-end, GraphQL presents itself as easier to manage as a front-end developer. Imagine the following scenario before GraphQL. A client is asking for a simple change to the layout, and as they aren&#39;t web-savvy, they&#39;re doing it in graphical terms. &quot;I want the categories of the posts to be &lt;em&gt;here&lt;/em&gt;,&quot; the client says, pointing to a spot on the mock-up. The front-end developer, whom the client is facing, nods their head, but they know that in order to access that information (let&#39;s assume, at the very least, that it&#39;s stored in the database), they&#39;re going to have to chat with the person responsible for building the API&#39;s controller, who needs to add the requested information, namely the string values of the categories.&lt;/p&gt;
&lt;p&gt;With GraphQL, all of the information stored in the database is readily accessible to the front-end developer. As soon as the client makes that request, the front-end developer can restructure their query to include the information, making it a much smoother process to access that data and iterate in terms that are comfortable for the less tech-savvy client.&lt;/p&gt;
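&lt;p&gt;To make that concrete with a hypothetical query (the schema and field names here are invented for illustration, not my actual Sanity schema): adding the client&#39;s requested field is a one-line change on the front end.&lt;/p&gt;

```typescript
// Hypothetical GraphQL query; the "categories" block is the client's
// new request, added without touching any back-end controller.
const POSTS_QUERY = `
  query Posts {
    allPost {
      title
      publishedAt
      categories {
        title
      }
    }
  }
`;

console.log(POSTS_QUERY.includes("categories")); // true
```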
&lt;p&gt;Well, at least that&#39;s how I imagine the strengths of this query language, you know, after using it for like maybe 90 minutes. I&#39;m excited to find the limitations of the language, especially since I found out that a company that my friend is working at is actually switching back from GraphQL to RESTful APIs to transport data to the front-end. It seems, though, that in his case, he was dealing with lazy use of the query language, where the front-end folks were asking for way too much information and really blowing the performance of their product. GraphQL does seem a tad slower than a RESTful API, but for most use cases, I don&#39;t think it&#39;s a major performance issue.&lt;/p&gt;
</description>
      <pubDate>Tue Sep 28 2021 20:07:00 GMT-0700 (Pacific Daylight Time)</pubDate>
    </item>
  </channel>
</rss>