Author: Shruti Pandey
-
In-Context Learning for LLMs: Zero, One, and Few Shot Learning Examples
In-context learning is a powerful tool that has revolutionized the field of natural language processing (NLP). It allows language models to learn from context and adapt their behavior accordingly. In this blog post, I will discuss the relevance of in-context learning to large language models (LLMs) and explore the concept of in-context learning with zero-shot,…
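Since this excerpt introduces zero-, one-, and few-shot prompting, here is a minimal sketch of how the prompt structure differs between them. The task, labels, and example reviews below are invented for illustration; real prompts would be sent to an LLM API, which is omitted here.

```python
# A minimal sketch of zero-shot vs. few-shot prompt construction.
# The sentiment task and examples are hypothetical placeholders.

def build_prompt(task, query, examples=None):
    """Assemble a prompt string: with no examples it is zero-shot,
    with one example it is one-shot, with several it is few-shot."""
    parts = [task]
    for text, label in (examples or []):
        parts.append(f"Input: {text}\nOutput: {label}")
    parts.append(f"Input: {query}\nOutput:")
    return "\n\n".join(parts)

task = "Classify the sentiment of each movie review as positive or negative."

# Zero-shot: the model sees only the instruction and the query.
zero_shot = build_prompt(task, "A delightful, moving film.")

# Few-shot: in-context examples precede the query, so the model can
# infer the expected format and labels from context alone, with no
# weight updates.
few_shot = build_prompt(task, "A delightful, moving film.", examples=[
    ("I wasted two hours of my life.", "negative"),
    ("An instant classic.", "positive"),
])
```

The only difference between the two regimes is what appears in the context window; the model itself is unchanged.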
-
Transformer Architecture: Explained
The world of natural language processing (NLP) has been revolutionized by the advent of transformer architecture, a deep learning model that has fundamentally changed how computers understand human language. I find this topic fascinating because it blends complex computational models with real-world applications, effectively allowing machines to interpret and generate text with a level of…
-
Everything you wanna know about prompting and prompt engineering
The intersection of language and technology has always fascinated me, especially with the rise of Large Language Models (LLMs), which have fundamentally changed how we interact with machines. When I started in machine learning, it was mostly people working in data who routinely interacted with AI, but now everyone has access to AI systems on their…
-
Evolution of Language Models: From N-Grams to Neural Networks to Transformers
I am sure you have caught up on the buzz around Large Language Models (LLMs), the backbone of your favorite tool, ChatGPT, and so I thought this might be a good time to write a refresher on language models, including their definition, evolution, applications, and challenges. From n-gram models to transformer models like GPT-3 and…
-
How is AI compromising Consumer Privacy?
I was out for dinner with my friends and I was humming “Take Five” with my jazz-fanatic friend. Another friend asked us the name of the song and I said it’s Take Five. The next day, the same friend received an ad from Take Five, the oil-changing company. This was terrifying. My fears about devices…
-
Can we disassociate race from AI?
This semester (Spring 2023), I got an opportunity to assist Dr. Charmaine Royal with her course Race, Genomics, and Society. We had our last class this past week, and Dr. Royal asked us to take the learnings from the class to the real world. In other words, moving from bench side to curbside when it comes…
-
QUITE Framework for Data Exploration with example
The first time I mention EDA (Exploratory Data Analysis) or Data Exploration to people who want to tell stories with data, I see expressions of confusion. I can totally understand why. I have been working with data for over 6 years now (including an advanced degree in Data Science), and I still get overwhelmed when…
-
Missing Women In Tech
I have been wanting to write this article ever since I had a conversation with my friend about the gender inequality that exists in tech. She is a CS grad and I am an ECE grad, both with master's degrees, and we were talking about how insecure we feel about our mathematical and coding…
-
Why Governments should focus on building Data Infrastructure
Claude Shannon, in information theory, asserts that as the level of information increases, the uncertainty related to that event decreases. Going by the dictum, in a highly uncertain event like COVID-19, there is a lot of information to be gained. Countries like South Korea and Taiwan were severely affected by MERS and SARS in the…
-
Making a Case for an ADHD-Inclusive World
Duke University Library is a treasure where I have discovered thought-provoking, mood-lifting, and spiritually enriching books. Last summer, I came across a book in the aisles of the library: CrazyBusy: Overstretched, Overbooked, and About to Snap! Strategies for Coping in a World Gone ADD. What caught my eye was the phrase on the cover,…