#books:
tksstgiftguide · 5 months
Photo
We have three of Theodore Gray’s science and technology books, and they’re fantastic for all ages to explore. How Things Work will be an excellent 256-page addition to our collection this year. 
Get this on Amazon
Get this at Bookshop
4 notes
malicedafirenze · 10 months
Text
The Sharing Knife by Lois McMaster Bujold - Recommendation and Discussion
I'm backing up all my book reviews from reddit. This was originally posted on /r/Fantasy on 2018/03/18
I already made a post about the first book in the series a few weeks ago, but I want to elaborate now that I've read the whole series.
I'll try to make this post work as a recommendation for people who haven't read it, so spoilers will be tagged.
The Basics:
The Sharing Knife is a story about a young farmer girl who meets an older lakewalker, a kind of soldier-sorcerer. They fall in love and eventually realize that magical and non-magical people need to work together against the malices (life-stealing magic entities) or both will be doomed.
The sub-genre would be best described as romance/adventure, I suppose, with a lot of the plot focusing on relatively "mundane" events, but with bits of action here and there.
Themes and Scope: I liked that Sharing Knife is pretty "slice of life", even though the later books let on that some fairly world-changing events are happening, or at least being set into motion. The overall focus on the relationship between the cultures of lakewalkers and farmers makes for pretty interesting worldbuilding, I think.
Not a Standalone: It's obviously a series, but more so than with other series, I thought the first book cannot and should not really stand on its own. As I explained in my other post, I was somewhat underwhelmed by the plot of the first book because basically all the action happens right at the start, and I'm not a big fan of plots solely about weddings. The balance works a lot better if you look at the whole series as two big volumes (one/two and three/four) or even one big story.
Family Issues: Both protagonists have a number of issues with their families on a very relatable scale. Fawn's issues in the first book particularly resonated with me, where she is starting to become more confident once away from her family, but slips back into old insecure habits once she is back in those surroundings. I've been there both with some members of my family and with certain groups of friends, and found it incredibly relatable to read: this feeling that you don't particularly like yourself around a certain group of people or that you just lose all confidence if you're around people who have a tendency to pick on you, even if it's for minor things.
Ground Sense: I loved the workings and descriptions of Ground Sense (the lakewalkers' magic, their underlying sense of the 'spirit' of everything around them, but especially living things). Some of the concepts, like changing the Ground / essence of something in the 'spiritual world', seemed really familiar to me, but I'm not sure where to place it. I guess Shadesmar and Soulcasting from the Stormlight Archive have similarities, but I'm not sure if it doesn't also remind me of something else. What other books have a 'magic system' where everything has an 'essence' or 'spirit' and the physical form of it can be changed by modifying that spirit, if you have the skill/magic powers to do so?
Romance: I always find myself pining for more romance in most of the fantasy that I read, and I quite liked it in Sharing Knife, after getting over the characters' age gap. Because the later books' "drama" is much more focused on the world around the characters rather than any conflict between them, the romance got much less prominent. Which is nice, I guess. I wouldn't have wanted there to be any artificial drama between them, but it's not really romance anymore if it's just a story about a couple? Idk, I like more turbulent relationships I guess, but it was also nice for a change to just have an established couple with no pointless issues between them.
'Women's Health': I liked that pregnancy and miscarriage are both fairly big themes/plot points at some point in the series, since that is something so often left out of fantasy/adventure books. Sharing Knife is pretty 'open' about such things and I thought that was a fresh change. very minor complaint
Gender Roles: Sharing Knife does a pretty amazing job of having female characters who fit into traditional gender roles and expectations (Fawn, for example) while still letting them be interesting and relatable characters. There's no 'not like the other girls' and no 'strong independent woman'™ syndrome, but there are women who patrol and fight etc., and both that and being a traditional housewife are presented without any sort of judgement for the other, which I think is really nice since a lot of fiction still tends to look down on traditional femininity.
Book 4 Action: All the books have their action sequences with a real sense of danger, but hot damn I didn't expect shit to get that real in book four. [book 4 spoiler]
Sequels: I thought the events set into motion towards the end of the series, meaning [book 4 spoilers], would make for a super interesting sequel, perhaps set a few decades later.
I really enjoyed Sharing Knife in a... special kind of way. It was all really warming, somehow. There's conflict and danger sometimes, but all in all it's a really comfortable-feeling story, it's got a certain kind of coziness to it.
I'd definitely recommend it to anyone who likes romance and slice-of-life types of stories.
So yeah thank you /r/fantasy, as usual, for recommending good books to me and I hope to pass on the favor :D
Edit: forgot to mention: I listened to the whole series as audiobook read by Bernadette Dunne. Very good audiobook and pleasant narrator :)
2 notes
aesharecom · 17 hours
Text
Kinetic Typography V1 Premiere Pro | VideoHive 40473185
Files Included: Motion Graphics Template Files
Software Version: Premiere Pro CC, After Effects CC
Resolution: 1920x1080
File Size: 4 MB
Direct Download is available for Premium Users only.
0 notes
jcmarchi · 18 days
Text
A new world of 2D material is opening up - Technology Org
New Post has been published on https://thedigitalinsider.com/a-new-world-of-2d-material-is-opening-up-technology-org/
Materials that are incredibly thin, only a few atoms thick, exhibit unique properties that make them appealing for energy storage, catalysis and water purification. Researchers at Linköping University have now developed a method that enables the synthesis of hundreds of new 2D materials. Their study has been published in the journal Science.
In a film that measures only a single millimetre, there can be millions of layers of 2D material, which gives rise to its unique properties. Image credit: Olov Planthaber/Linköping University
Since the discovery of graphene, the field of research into extremely thin materials, so-called 2D materials, has grown exponentially. The reason is that 2D materials have a large surface area in relation to their volume or weight. This gives rise to a range of physical phenomena and distinctive properties, such as good conductivity, high strength or heat resistance, making 2D materials of interest both in fundamental research and in applications.
“In a film that’s only a millimetre thin, there can be millions of layers of the material. Between the layers there can be a lot of chemical reactions and thanks to this, 2D materials can be used for energy storage or for generating fuels, for example,” says Johanna Rosén, professor in Materials physics at Linköping University.
Three-step process
The largest family of 2D materials is called MXenes. MXenes are created from a three-dimensional parent material called a MAX phase. It consists of three different elements: M is a transition metal, A is an (A-group) element, and X is carbon or nitrogen. By removing the A element with acids (exfoliation), a two-dimensional material is created. Until now, MXenes have been the only material family created in this way.
The Linköping researchers have introduced a theoretical method for predicting other three-dimensional materials that may be suitable for conversion into 2D materials. They have also proved that the theoretical model is consistent with reality.
To succeed, the researchers used a three-step process. In the first step, they developed a theoretical model to predict which parent materials would be suitable. Using large-scale calculations at the National Supercomputer Centre, the researchers were able to identify 119 promising 3D materials from a database selection of 66,643 materials.
From theory to lab
The next step was to try to create the material in the lab.
“Out of 119 possible materials, we studied which ones had the chemical stability required and which materials were the best candidates. First, we had to synthesise the 3D material, which was a challenge in itself. Finally, we had a high-quality sample where we could exfoliate and etch away specific atomic layers using hydrofluoric acid,” says Jie Zhou, assistant professor at the Department of Physics, Chemistry and Biology.
The researchers removed yttrium (Y) from the parent material YRu2Si2, which resulted in the formation of two-dimensional Ru2SixOy.
But to confirm success in the lab, verification is necessary – step three. The researchers used the scanning transmission electron microscope Arwen at Linköping University. It can examine materials and their structures down to the atomic level. In Arwen, it is also possible to investigate which atoms a material is made up of using spectroscopy.
“We were able to confirm that our theoretical model worked well, and that the resulting material consisted of the correct atoms. After exfoliation, images of the material resembled the pages of a book. It’s amazing that the theory could be put into practice, thereby expanding the concept of chemical exfoliation to more materials families than MXenes,” says Jonas Björk, associate professor at the division of Materials design.
Endless applications
The researchers’ discovery means that many more 2D materials with unique properties are within reach. These, in turn, can lay the foundation for a plethora of technological applications. The next step for the researchers is to explore more potential precursor materials and scale up the experiments. Johanna Rosén believes that future applications are almost endless.
“In general, 2D materials have shown great potential for an enormous number of applications. You can imagine capturing carbon dioxide or purifying water, for example. Now it’s about scaling up the synthesis and doing it in a sustainable way,” says Johanna Rosén.
The study was funded by the Knut and Alice Wallenberg Foundation, the Wallenberg Initiative Materials Science for Sustainability (WISE), the Göran Gustafsson Foundation for Research in Natural Sciences and Medicine, the Swedish Foundation for Strategic Research, the European Union, the Swedish Research Council and the Swedish Government Strategic Research Area in Materials Science on Advanced Functional Materials, AFM, at Linköping University.
Article: Two-dimensional materials by large-scale computations and chemical exfoliation of layered solids; Jonas Björk, Jie Zhou, Per O. Å. Persson and Johanna Rosen; Science 2024. Published online 15 March 2024. DOI: 10.1126/science.adj6556
Written by Anders Törneholm 
Source: Linköping University
0 notes
20thcentutygeek · 2 months
Text
Folio Society Presents Batman!
(February 19, 2024) The Folio Society, independent publisher of beautifully illustrated hardback books, in collaboration with DC, will celebrate the 85th anniversary of the first comic book appearance of DC’s Dark Knight Detective with the release of DC: Batman. Created by Bob Kane with Bill Finger, Batman first appeared in 1939’s Detective Comics #27 and since then the Dark Knight has stood as a symbol of determination, courage and justice to generations of fans for over 80 years. Batman is one of the most iconic fictional characters in the world, and is a self-made Super Hero, notable not for his super powers, but for his intelligence, determination, and tech savvy.
This collectible compilation includes twelve seminal comics by a host of iconic writers and artists—including Bill Finger, Bob Kane, Jerry Robinson, Denny O'Neil, Neal Adams, Marshall Rogers, Frank Miller, Dave Mazzucchelli, Alan Moore, Brian Bolland and Kelley Jones—all selected and introduced by former DC President, Publisher and Editor-in-Chief Jenette Kahn. Along with the 320-page one-of-a-kind deluxe book, DC: Batman also comes with a stand-alone replica copy of Batman #1. Scanned in its entirety from an original 1940 copy, the replica includes the original back-up strips and vintage ads and introduces DC’s Clown Prince of Crime, aka The Joker, and The Cat, who would come to be known as Catwoman.
“Created towards the end of the Great Depression by artist Bob Kane with writer Bill Finger, Batman is an icon as familiar as James Bond or Tarzan, one who has evolved to reflect the changing attitudes of the twentieth century,” said Folio Society Head of Editorial, James Rose. “The stories selected for DC: Batman reveal how the character and his billionaire alter-ego Bruce Wayne gradually evolved from the dutiful crimefighter of the 1940s to a man possessed, as crazy as the criminals he puts away. The Caped Crusader faces a rogue’s gallery steeped in gothic horror, from the Weimar cinema-inspired The Joker to the Jekyll/Hyde figure of Two-Face and the Moriarty-like Ra's al Ghul.”
“Trauma is a through-line in the Batman mythology,” writes Jenette Kahn in her introduction. “It has made psychopaths of Batman’s foes and brought him to the edge of madness himself. Batman’s battle is not just against criminals and crime. He fears the day he’ll look into a mirror and see, not Bruce Wayne’s face, but The Joker’s.” The first woman at the helm of the legendary comic book publisher, Jenette Kahn helped transform comics into a sophisticated art form during her 27-year tenure from 1976 to 2002.
“The Dark Knight Returns by Frank Miller and Batman: Year One, by Miller and Dave Mazzucchelli, and the terrifying classic The Killing Joke by Alan Moore and Brian Bolland are widely regarded as among the greatest comic books ever created,” said Folio Society Publishing Director, Tom Walker. “These stories changed the graphic medium forever with their combination of cinematic storytelling, shocking violence and literary depth and serve as centerpiece texts for DC: Batman.”
DC: Batman includes:
Facsimile: Batman #1 (Spring 1940)
Writer: Bill Finger
Cover artists: Bob Kane, Jerry Robinson
Artists: Bob Kane, Sheldon Moldoff
Editor: Whitney Ellsworth
The Bat-Man
Detective Comics #27 (May 1939) 
Writer: Bill Finger
Artist: Bob Kane
Editor: Vincent Sullivan
Robin—the Boy Wonder
Detective Comics #38 (April 1940)
Writer: Bill Finger
Artists: Bob Kane, Jerry Robinson
Editor: Whitney Ellsworth
The Crimes of Two-Face!
Detective Comics #66 (August 1942) 
Writer: Bill Finger
Artists: Jerry Robinson, George Roussos
Letterer: Ira Schnapp
Editor: Whitney Ellsworth
Batman and Green Arrow: The Senator’s Been Shot!
The Brave and the Bold #85 (September 1969)
Writer: Bob Haney
Cover artist: Neal Adams
Penciler: Neal Adams
Inker: Dick Giordano
Letterer: Ben Oda
Editor: Murray Boltinoff
Daughter of the Demon
Batman #232 (June 1971)
Writer: Dennis O'Neil
Cover artist: Neal Adams
Penciler: Neal Adams
Inker: Dick Giordano
Letterer: John Costanza
Editor: Julius Schwartz
The Dead Yet Live
Detective Comics #471 (August 1977)
Writer: Steve Englehart
Cover artists: Marshall Rogers, Terry Austin, Tatjana Wood, Gaspar Saladino
Penciler: Marshall Rogers
Inker: Terry Austin
Colorist: Marshall Rogers
Letterer: John Workman
Editors: Julius Schwartz, E. Nelson Bridwell
The Dark Knight Returns
Batman: The Dark Knight Returns #1 (June 1986)
Writer: Frank Miller
Cover artists: Frank Miller, Lynn Varley
Penciler: Frank Miller
Inker: Klaus Janson
Colorist: Lynn Varley
Letterer: John Costanza
Editors: Dick Giordano, Dennis O'Neil
Batman: Year One—Chapter One: Who I Am—How I Come to Be
Batman #404 (February 1987)
Writer: Frank Miller
Artist: Dave Mazzucchelli
Colorist: Richmond Lewis
Letterer: Todd Klein
Editor: Dennis O'Neil
Batman: The Killing Joke (July 1988)
Writer: Alan Moore
Cover artists: Brian Bolland, Richard Bruning
Artist: Brian Bolland
Colorist: John Higgins
Letterer: Richard Starkings
Editors: Dennis O'Neil, Dan Raspler
The Last Arkham (Part One)
Batman: Shadow of the Bat #1 (June 1992)
Writer: Alan Grant
Cover artist: Brian Stelfreeze
Penciler: Norm Breyfogle
Inker: Norm Breyfogle
Colorist: Adrienne Roy
Letterer: Todd Klein
Editors: Scott Peterson, Dennis O'Neil
Knightfall Part 1: Crossed Eyes and Dotty Teas
Batman #492 (May 1993)
Writer: Doug Moench
Cover artists: Kelley Jones, Bob LeRose
Penciler: Norm Breyfogle
Inker: Norm Breyfogle
Colorist: Adrienne Roy
Letterer: Richard Starkings
Editors: Scott Peterson, Jordan B. Gorfinkel, Dennis O'Neil
DC: Batman is the second release in the Folio Society publishing program with DC, following the acclaimed DC: The Golden Age. DC: Batman has been made according to The Folio Society’s exceptional production standards. Scanned from original copies held in the DC archives, the comics have been reproduced in 10” x 7” treasury format. An anti-scratch laminated hardcover features Batman’s signature silhouette, with titles foil-embossed in yellow and midnight blue, and the book itself is cowled in a pitch-black slipcase bearing the famous Bat-Signal. A compendium of gothic artwork and Batarang-sharp storytelling, DC: Batman is an unmissable investigation into the adventures and pathology of one of the world’s most famous – and most troubled – DC Super Heroes. The Folio Society edition of DC: Batman, selected and introduced by Jenette Kahn, will be available for £65 / US $100 on February 20, 2024, exclusively from https://www.foliosociety.com/usa/fiction/comics-graphic-novels.
0 notes
home-decor-design · 3 months
Text
Holiday 2024 for early bookers: The best travel destinations in Europe that you should book now
If you book early, you not only save money, but also benefit from many advantages. Are you looking for your next travel destination? Get inspired to plan your unforgettable vacation in 2024 now. Summer 2023 is almost over, and since the unforgettable vacation of 2023 will soon be just a memory, we are already thinking about our next travel destination. There's nothing more exciting than discovering new vacation destinations and experiencing what they have to offer. If you're still unsure where to spend your 2024 vacation, read on to find the best destinations.
Table of contents
- Vacation 2024: The best travel destinations you should book now
- Spain
- Italy
- Türkiye
- Greece
- Vacation 2024: Croatia
- Portugal
Vacation 2024: The best travel destinations you should book now
Deciding where to spend your vacation time can be a difficult task, especially when traveling in Europe, where the options for an unforgettable vacation are endless. Whether you're looking for a sun-drenched beach vacation, an enriching cultural experience or simply want to savor delicious cuisine, the following destinations have much to offer. Every single destination on the list is unique and will fill you with excitement. But don't wait until the last minute, as early bookers get the best deals and have the largest selection.
Spain
Spain is one of the most popular countries for holidays with its sunny Mediterranean climate. The country has some of the most beautiful beaches, with a turquoise sea that contains all shades of blue and green. This is the reason why Spain is at the top of the most popular countries for a summer vacation in Europe. The quality of life, the healthy and cozy environment and the rich culture also explain why the country is so famous among holidaymakers. Whether it's the Balearic Islands or the country's east coast, stretching from Costa Blanca to Costa de la Luz, you're sure to enjoy Spain. But Valencia, Alicante and Málaga also deserve a lot of attention.
Italy
The next country on the list of best summer destinations is Italy, which is also a popular choice for summer vacations with its clear skies and warm temperatures. From the rolling hills and vineyards of Tuscany to the dazzling Amalfi Coast, romantic cities like Rome, Florence and Venice, and breathtaking islands like Sicily and Sardinia, there are so many beautiful places that will help you experience your dream vacation. Not to mention the many culinary experiences the country has to offer.
Türkiye
More and more holidaymakers are considering Turkey as their preferred travel destination, and there are many reasons for this. Above all, the country has something to offer for everyone, whether you want a relaxing beach vacation or a unique adventure. With its crystal clear waters and soft white sand, Turkey is the perfect beach holiday destination. The country borders three seas (the Aegean, the Mediterranean and the Black Sea), so you can choose places like Antalya, Marmaris and Bodrum for your ideal summer vacation. Its unique location between Europe and Asia has contributed greatly to its diverse cultural offerings. The delicious food and friendly locals will soon make you feel at home.
Greece
With its historic sites, incredible, endless coastline and over 200 beautiful islands, it's no wonder Greece is one of Europe's best summer destinations. The country offers a mild Mediterranean climate and a long summer season. From famous destinations like Mykonos, Zakynthos and Santorini to quiet, authentic mountain villages, there is something to suit every type of traveler. Whether you want to soak up the sun on a quiet beach, explore the remains of an ancient civilization, or simply treat yourself to delicious food and Greek music, Greece has much to offer.
Vacation 2024: Croatia
With its bright blue sea and over 1000 islands, Croatia has become increasingly popular in recent years. And if you are a fan of Game of Thrones, this would be a special place to visit. Enjoy the ancient cities and the magnificent Adriatic coast, and spend your vacation on some of the best beaches in the world. While chic bars, exciting nightlife and culinary experiences await all party lovers on the pretty island of Hvar, the island of Rab is very popular with families for its soft sand and shallow waters.
Portugal
With its golden beaches, cobalt blue waters and ocher cliffs, Portugal's southern coast is a true natural spectacle. You can expect long sunny days and a relaxed pace of life in the fantastic Algarve. Whether you want to lounge on the beautiful beaches, stroll through the old towns or enjoy the water parks, you're sure to find something that's right for you. Don't think about it for too long, and grab the best early bird offers now to travel at a reduced price next summer.
1 note
esam12345 · 6 months
Photo
(via Magazine Publishing)
Magazine Publishing Report!
Legal Notice:- The author and publisher of this Ebook and the accompanying materials have used their best efforts in preparing this Ebook. The author and publisher make no representation or warranties with respect to the accuracy, applicability, fitness, or completeness of the contents of this Ebook. The information contained in this Ebook is strictly for educational purposes. Therefore, if you wish to apply ideas contained in this Ebook, you are taking full responsibility for your actions.
The author and publisher disclaim any warranties (express or implied), merchantability, or fitness for any particular purpose. The author and publisher shall in no event be held liable to any party for any direct, indirect, punitive, special, incidental or other consequential damages arising directly or indirectly from any use of this material, which is provided “as is”, and without warranties.
As always, the advice of a competent legal, tax, accounting or other professional should be sought. The author and publisher do not warrant the performance, effectiveness or applicability of any sites listed or linked to in this Ebook. All links are for information purposes only and are not warranted for content, accuracy or any other implied or explicit purpose.
Table of Contents
Chapter 1 – Selecting the Right Niche
Chapter 2 – Costs and Funding Options
Chapter 3 – Creating the Content
Chapter 4 – Finding Advertisers
Chapter 5 – Alternative Revenue Sources for Your Magazine
Chapter 6 – Printing
Chapter 7 – Getting Subscribers
0 notes
narrativestringtheory · 9 months
Text
Flying Toward a Twenty-First Century Aesthetics of Technomagic Girlhood
by Ravynn K. Stringfield
 What is Technomagic Girlhood?
When I began thinking, reading and writing about Black girl superheroes in my dissertation, I found I wanted a way to explore how characters like Riri Williams as Ironheart and Lunella Lafayette as Moon Girl were both performing fantastic feats while defining and creating their Black girlhood with the scientific, technological, and digital tools available to them. Their oft-favorite feat? Flight. This list of characters includes but is in no way limited to: Shuri from Black Panther lore, Karen Beecher as Bumblebee, Lunella Lafayette as Moon Girl, and Max from Batman Beyond, the animated television show which ran on the Kids’ WB from 1999 to 2001. There are arguments to be made for extending the category to include characters like Marvel Comics’ Misty Knight or even young Diana (Dee) Freeman from HBO Max’s Lovecraft Country, a speculative horror show adapted as a continuation of the 2016 Matt Ruff novel of the same name. In entering a conversation around Black superheroines that scholars like Sheena Howard, Deborah Whaley and Grace D. Gipson have nourished, technomagic girlhood became the term I used, as I was fascinated by the way innovative digital practices and self-making become intertwined for Black girls in superhero stories where our current reality and its technologies were recognizable, but where these girls could manipulate technology to give themselves the ability to literally (and metaphorically) fly.
The idea of technomagic girlhood draws energy from a number of related terms, primary among them being Afrofuturism, the artistic/aesthetic movement and critical framework around the relationship of folks of the African diaspora to the future, technology, and questions of liberation.1 Technomagic girlhood sits underneath the large Afrofuturistic umbrella, though it takes as its large focal point the fantasy genre, magic, the unexplained, whereas much of the strongest Afrofuturistic theorizing prioritizes science fiction as a genre. Work is being done amongst scholars all over to push the boundaries of what constitutes Afrofuturism, and what is in conversation with it.
Also related is Moya Bailey’s term, digital alchemy, which she uses in Misogynoir Transformed to refer to “the ways that women of color, Black women, and Black nonbinary, agender, and gender-variant folks in particular transform everyday digital media into valuable social justice media that recode the failed scripts that negatively impact their lives” (24). Hashtags under the work of Black women, Black queer folks and Black gender expansive folks become entire movements, with “alchemy” implying a chemistry. The chemistry of it all denotes a type of work, rather than the social justice media appearing as if by will alone, not backed by the labor of Black women and femmes. I prefer technomagic rather than alchemy because magic connotes, for some, a discipline, but it carries a joy of use as well. Bailey continues: “Digital alchemy shifts our attention from the negative impact stereotypes in digital culture to the redefinition of representations Black women are creating that provide another way of viewing their worlds” (24). There is a joy in learning to manipulate science, technology and the digital to your own ends for experiments in redefinition and self-making in technomagic girlhood.
For this playful turn, I draw from digital ethnomusicologist Kyra D. Gaunt’s work on embodied play and Black girlhood. I also use “magic” because it locates me more clearly in a legacy of the Black speculative, the Black fantastic, longer traditions of Black girls in magic—and accounts more clearly for how it is possible for these girls to fly. But to fly does not always mean “magic” is afoot. Flight is a condition of reality in texts such as Virginia Hamilton’s retelling of African American folk tales, The People Could Fly (1985), in which Africans take flight back home, and Toni Morrison’s critically acclaimed novel, Song of Solomon (1977), in which Pilate takes to the air. The “techno” prefix is inspired in part by film scholar Anna Everett’s work on Black technophilia and draws us more toward a legacy of Black participation in technology and the digital.
As in #BlackGirlMagic, a commonplace example of technomagic girlhood practice to me, the magic and the fantastic are deeply rooted in reality. This hashtag originates with CaShawn Thompson, who in an interview with journalist and author Feminista Jones says: “I was the first person to use Black Girl Magic or Black Girls Are Magic in the realm of uplifting Black women. Not so much about our aesthetic but just who we are.” Our magic, Thompson argues, is simply the truth; it was true of her everyday life and how she experienced the world. There is nothing speculative about it; it is simply and uniquely of Black girlhood. In many ways, Thompson’s understanding of Black Girl Magic is in conversation with how I understand technomagic girlhood and the potential of what it could be.
I specifically came to use “technomagic” when writing about Marvel Comics’ teenage superhero Riri Williams, also known as Ironheart. The young Chicagoan was able to create her own version of Tony Stark’s Iron Man suit, making her supergenius hypervisible on a large scale, even as she uses it locally to help her community. Afrofuturism was the term that I had used for a long time in my work on her, but when we are first introduced to Riri in Eve L. Ewing and Luciano Vecchio’s run, seeing this Black girl in her tech suit, by which she has engineered herself the ability to fly, her face turned skyward, reveling in joy and legacy as seen below…something else was occurring. Something that needed to center Riri’s Black girlhood, her experimental and creative self-making through technology, and the joy of impossibility now made tangible…2
Technomagic girlhood is in part a response to some of the questions that André Brock, Jr. asks in Distributed Blackness about whether or not his work on technology and the internet is Afrofuturistic: what about the digital present? Afrofuturism, Brock argues, “is rightly understood as a cultural theory about Black folks’ relationship to technology, but its futurist perspective lends it a utopian stance that doesn’t do much to advance our understanding of what Black folk are doing now” (15). In considering the “now” in possibilities of Black technophilia, technomagic was where I had space to spread out and play as Riri does, as many of the contemporary Black girls do, informed deeply by the legacies and lineages that have come before.
 Cover Girls
I choose to examine here Riri Williams as the catalyst for my interest in the topic, along with DC Comics’ Natasha Irons. In what follows, I address these characters’ relationship to technomagic, as seen in the covers for the collected edition of Ironheart: Meant to Fly (Marvel, 2020) and Action Comics #1054 (DC Comics, 2023) that exemplify a few core characteristics of a visual aesthetics of technomagic girlhood and work in tandem.
Technomagic describes a particular quality of contemporary Black girlhood, expansively defined. While this idea most certainly can be applied to other groups of people, I use it as a way of understanding the relationship Black girls in superhero media and other fantasy narratives have to science, technology and digital media, to creativity and joy, and to self-making. By Black girlhood, I often think of how Aria S. Halliday and the authors of the Black Girlhood Studies Collection interrogate the ways in which society tends to conflate Black girlhood and Black womanhood, in both seemingly innocuous and explicitly dangerous ways. Black girls’ joy practices are central to education scholar Ruth Nicole Brown’s work and are resonant here when viewing Riri in Ironheart #1, skyward facing, heart open and Natasha’s focused joy on the cover shown below.
To remember that Black girlhood can be expansive, it is important to incorporate writers who consider girlhood to be a state of mind and being, rather than exclusively an age range. Digital ethnomusicologist Kyra D. Gaunt, for example, asks readers to engage questions of girlhood that include women who might begin their intimate stories to each other with a resonant, “Giiiirl” (p. 2). And Moya Bailey urges readers to consider a wider breadth of possible people who might be brought in by widening what we consider womanhood in her book Misogynoir Transformed: Black Women’s Digital Resistance—in particular, she argues that more than cisgendered heterosexual Black women are harmed by misogynoir (p. 18-22). This is relevant for Natasha (above right), who is canonically a lesbian in the comics, and whose cover brings to mind the colors of the bisexual flag: pink, purple and blue.
After the primary condition of Black girlhood is established, there are secondary conditions that are present in an aesthetics of technomagic girlhood. These include elements of:  
impossibility, whether feats or conditions;
creativity, ingenuity, or innovation, often expressed as a practice of the girl in question;
technology, science, or digital media;
self-making or alter-ego creation; and
unbridled joy.
The magic emerges from the clear masterful manipulation of most of these elements in a playful fashion, often for heroic ends, though regularly for their own enjoyment as well.
When we look at these two covers together, we can see elements from many of these categories. On Riri’s cover (by Amy Reeder for Ironheart #4)3, we see a young Black girl who has presumably engineered herself the ability to fly—impossibility—but who, in this instance, is now falling. Riri falls downward and, judging by the surprise on her face, it appears that the Ironheart suit she has created for herself has fallen apart. It calls to mind the Greek myth of Daedalus and Icarus, though Riri is both: the famed inventor and also the child who maybe has flown too close to the sun. Riri’s relationship to this myth calls to mind the ingenuity and technology inherent to technomagic girlhood. But the image is juxtaposed with the title phrase “meant to fly”—so, perhaps it is that Riri’s suit is coming to her, not away from her, to save her, to enable her flight, because that impossible feat is what she deserves, and she knows it is hers. She created the impossible and trusts in her own ability—self-making. Notably, in issue #1, Riri can find who she is within the suit, and within a larger legacy not just of superheroics, but of Black women who made her possible. This is both self-making and joy.
DC Comics’ Black girl science genius, Natasha Irons, has a longer history than Riri Williams. Where Riri’s origins date back to the Invincible Iron Man run in 2016 (Vol. 3 #7) written by Brian Michael Bendis and illustrated by Mike Deodato, Natasha Irons was introduced as Dr. John Henry Irons’ precocious niece in Steel #1 (February 1994), written by Jon Bogdanove and Louise Simonson, with art by Chris Batista and Rich Faber. While Irons earns his claim to fame by filling in for Superman, going on to become a hero in his own right, over time Natasha develops an aptitude for science as she hangs around her uncle, eventually proving adept at working on Irons’ suit and going on to develop her own. Natasha’s heroism in her own right has only deepened with time. With a new Steelworks run beginning in the summer of 2023, there have been opportunities for Natasha fans to get excited. Most recently, Action Comics #1054 had a variant cover (1:25) by Milestone Initiative artist Yasmín Flores Montañez featuring a solo Natasha in a similar vein to the iconic Riri “Meant to Fly” cover (shown above).4
On this cover, Natasha more clearly appears to be attracting the pieces of her suit to her as she leans over, possibly suspended in air—similar to some iconic scenes from the Marvel Cinematic Universe Iron Man films—anchored by a neon pink background, with a touch of blue, guiding the viewer to think more critically about our gendered assumptions regarding technology and science. There’s a look of satisfaction on her face—this is where she is meant to be. She clearly wears the crest of the House of El—Superman’s iconic “S”—as a symbol of hope, though perhaps this will mean something different for Natasha in the issues to come. Natasha’s relationship to this technology, her ability to manipulate it, will inevitably lead to some creative self-making in relationship to this iconic symbol and who she is within it—and without it.
Optimally, technomagic girlhood does not prioritize a capitalistic notion of the lone Black girl science genius. It is not simply “Black Girls Code” for a means to an end. There must be a fantastic joy to it, enabling the Black girl in question to just be, to simply exist, to feel confident in exploring her sense of self, to experiment in self-making. It is not the entering into science and technology spaces to perpetuate capitalistic ideas of productivity or advancement, but for joy and exploration of the self. Therefore, those who care about the well-being of Black girls—all children—must work toward communal needs being met.
In order for this to be meaningful, it needs to be communal, or in relation to others, as it is portrayed in Eve L. Ewing’s twelve-issue Riri Williams: Ironheart run (2018-2019). The idea of the lone Black girl genius feeds into harmful stereotypes related to the magical Negro; instead, intelligence can, and should be, nurtured in community. In Ewing’s Ironheart, Riri’s mother is a loving and watchful presence. Xavier King is Riri’s friend in the series, not just a teammate as many of the other supers she encounters in other runs are. Xavier cares about Riri as a person, with no real investment in what she can offer him. Those who participate in technomagic girlhood are still, after all, girls—children—and still need to love and be loved. 
Ewing’s Ironheart gives Riri something she hasn’t had until that point: space to be. We should be working towards these girls’ ability to just be. The ability to create and play in these spaces is contingent on safety. Though Black girls will continue to create and play in spite of oppressive systems, it does not mean these systems as constructed are just. What will it mean for technomagic girlhood to not just be reactive, but to be generative?5 What will it mean for technomagic girlhood to embrace Afrofuturism in so far as it connects to questions of abolition, which devalues the role of policing and commits to a politics of care, as we seek to imagine new and better worlds for Black people, especially children?6 By this I mean: when safety and care are prioritized, what new worlds might our Black girls imagine with their newfound access to digital tools?
 Conclusion
With technology, science, and digital media as the backdrop of our era, Black girls who engage in technomagic are increasingly enabled. They are the girls in fantasy stories who may not be gifted with an inexplicable gift for controlling the weather or who can speak to animals, but who have a technophilia akin to magic. They make their ordinary lives extraordinary with their ability to manipulate and build their sense of self in the process. Here, I’ve examined technomagic in superhero narratives, but the principles can and likely will apply across different types of speculative media where Black girls have unique relationships to science, technology and digital media. In particular, these girls are often seen more widely in comic stories adapted for screen: most folks met Riri Williams for the first time on screen in Wakanda Forever (dir. Ryan Coogler, 2022), the sequel film to Black Panther (dir. Ryan Coogler, 2018).
While the general connotation of this term slants towards positivity, as does a related popular phrase like Black Girl Magic, or the hashtagged version: #BlackGirlMagic, it’s worth approaching it with a touch of skepticism and several doses of care. Technomagic, while it does align us with the idea of the Black girl science genius, can also perpetuate the trope of the solitary genius, an idea which Ironheart writer, Eve L. Ewing, problematizes in a 2021 interview with Catapult: “…If that [the trope of the Black girl STEM superhero] becomes the only mode through which we see Black girls, that’s also a problem… I love Ironheart, I love Riri, but Shuri and Riri and Moon Girl are all science geniuses, you know? How does that reinforce certain limited notions about what Black intelligence or Black genius has to look like? How does that play into capitalist-driven conversation about Black girls in coding or Black girls’ participation in science fields?”
To Ewing’s inquiry and to Bailey’s assertions that digital alchemy helps us think about the possible ways Black women are redefining and rethinking about themselves, technomagic girlhood might offer one potential answer. Where we are able to keep joy practices, build and form community together, and experiment in self-making, we protect the essence of technomagic girlhood.7
Notes
1 The term “Afrofuturism” was originally coined in the 1994 roundtable essay “Black to the Future: Interviews with Samuel R. Delany, Greg Tate and Tricia Rose” by cultural critic Mark Dery in Flame Wars: The Discourse of Cyberculture. It is noted in the essay that Afrofuturism, both as an aesthetic and as a critical framework, has a much longer history, including origins that are often thought of as musical, thinking about the contributions of experimental musicians such as Sun Ra.
2 In this panel, Ewing invokes the legacy of Maya Angelou's poem “Still I Rise” (1978). The entire stanza reads:
Leaving behind nights of terror and fear
I rise
Into a daybreak that’s wondrously clear
I rise
Bringing the gifts that my ancestors gave,
I am the dream and the hope of the slave.
I rise
I rise
I rise.
3 Notably for this essay, Reeder is also known for her artwork on Moon Girl and Devil Dinosaur.
4 Milestone Media was an African American centric superhero comics publishing company founded in 1993 by Dwayne McDuffie, Denys Cowan, Michael Davis and Derek T. Dingle. DC Comics is currently relaunching Milestone and reintroducing its characters by bringing in a class of artists and writers specifically dedicated to the mission of Milestone. Flores Montañez is part of the Milestone Initiative’s inaugural class.
5 This question is in the spirit of Moya Bailey, whose work distinguishes between generative and defensive alchemy: one is creative for the community, the other responsive to hatred. It is my hope that technomagic girlhood is framed similarly to a generative digital alchemy.
6 I think here of the necessary and timely work of abolitionist organizers and writers Kelly Hayes and Mariame Kaba in their new book Let This Radicalize You: Organizing and the Revolution of Reciprocal Care (2023).
7 Gratitude: Many thanks to early readers of this piece for offering kind words and useful insights: Vanessa Anyanso, Shira Greer, Dr. Autumn A. Griffin, Dr. Jordan Henley, Grace B. McGowan, Kristen Reynolds and Dr. Justin Wigard. Conversations with KàLyn Banks Coghill and Dr. Francesca Lyn were also invaluable. Though they are not cited here, the scholarship of education scholars Drs. Ebony Elizabeth Thomas and S. R. Tolliver remain deeply influential to how I think and write. I would like to thank Dr. Shawn Gilmore for his careful editorial eye.
Works Cited
Bailey, Moya. Misogynoir Transformed: Black Women’s Digital Resistance. New York University Press (2021).
Bogdanove, Jon and Louise Simonson. Steel #1. DC Comics (1994).
Brock, André. Distributed Blackness: African American Cybercultures. New York University Press, (2020).
Everett, Anna. “On Cyberfeminism and Cyberwomanism: High-Tech Mediations of Feminism’s Discontents.” Signs (Vol. 30, No. 1, 2004).
Ewing, Eve L. & Luciano Vecchio. Riri Williams: Ironheart #1-12. Marvel Comics (2018-2019).
Ewing, Eve L. & Luciano Vecchio. Riri Williams: Ironheart: Meant to Fly. Marvel Comics (2020).
Gaunt, Kyra D. The Games Black Girls Play: Learning the Ropes from Double Dutch to Hip-Hop. New York University Press (2006).
Halliday, Aria S. ed. The Black Girlhood Studies Collection. Women’s Press, CSP (2019).
Jones, Feminista. “For CaShawn Thompson, Black Girl Magic Was Always the Truth,” Beacon Broadside (2019). https://www.beaconbroadside.com/broadside/2019/02/for-cashawn-thompson-black-girl-magic-was-always-the-truth.html
Montañez, Yasmín Flores. Action Comics #1054, 1:25 Variant Cover. DC Comics (2023).
Stringfield, Ravynn. “How Eve L. Ewing Makes Her Stories Fly,” Catapult Magazine, May 19, 2021. https://catapult.co/dont-write-alone/stories/interview-with-dr-eve-ewing-by-ravynn-stringfield
1 note
prabhatprakashan12 · 9 months
Video
Amazon.in : song of the trinity prabhat
His first book, ‘Shatru’, a prequel to ‘Song of the Trinity’, a fantasy series, is a thriller based on the Puranas, Upanishads, and Vedas. His second book, ‘Agniputr’, is based on a single stanza from the Yajurveda. His third book, ‘Fear of God’, deals with corruption and vigilantism. His fourth book, ‘The Vimana Transcripts’, is a thriller based on ancient Indian temples and is rated as one of the top five thrillers of 2021 by the Indian Booktuber.
Buy now: https://amzn.eu/d/jcyVL39
0 notes
outragedtortilla · 9 months
Quote
"Come, Boy, sit down. Sit down and rest." And the boy did. And the tree was happy.
#quotes
1 note
couponawk · 10 months
Text
How to Save More With My Social Book Coupons?
Table of Contents
1. How to Use My Social Book Promo Codes Get Lower Price?
2. How About Their Black Friday?
3. Does My Social Book Have Any Cyber Monday Promotions?
4. What’s Their Return and Exchange Policy?
5. What are the Benefits of Becoming a Member?
6. How about Refer a Friend offer?
If you are looking for some memorable gifts, then you must seriously consider My Social Book. My Social Book‘s main business is unique print products created from your Facebook and Instagram pictures. They will automatically capture your life from your Facebook and Instagram and turn it into a rich photo album or other wall decoration. This is indeed a gift that is both fun and hard to forget. You might think that this service and product will be so expensive that you can’t afford it, but the fact is that they will provide you with a lot of My Social Book coupons every day to meet your expectations. So let’s talk about how to get the best product at the most affordable price. 
1. How to Use My Social Book Promo Codes Get Lower Price?
I know that you really want to buy your favorite products while saving money, so you must need these coupons. How do you get My Social Book discount codes? The key to getting a discount is the couponawk.com website, as all kinds of coupons are displayed here. First, you need to put the product you want to buy into the shopping cart and go to the payment interface. At the same time, open couponawk.com to find the corresponding coupon, copy and paste its code into the payment interface, and you can see the discounted price. In short, the official website of My Social Book and couponawk.com will not let you down.
2. How About Their Black Friday?
It is well known that Black Friday is the most exciting shopping day among all the shopping activities. It has become a well-known promotional festival, and many people look forward to its arrival every year. My Social Book participates in Black Friday promotions every year. On that day, they will issue richer coupon codes than usual, so that customers can purchase long-awaited products without worry. Therefore, if you already know enough about them and are interested in their products, you’d better bookmark their official website or couponawk.com in advance to get the latest news of Black Friday. 
3. Does My Social Book Have Any Cyber Monday Promotions?
Of course, they also have Cyber Monday promotions. As sellers, they attach great importance to customer experience, and they hope that each customer can buy a satisfactory product at a reasonable price. So after Black Friday every year, they continue to launch a wealth of promotional activities for Cyber Monday. This not only follows the development of the retail industry, but also provides consumers with as much convenience as they can offer. Therefore, you can follow their Cyber Monday activities so as not to miss the voucher codes you need.
4. What’s Their Return and Exchange Policy?
When shopping online, you will inevitably encounter the need for a return or exchange. The return and exchange policy is an indispensable policy for merchants because it protects not only consumers’ rights but also the legitimate interests of merchants. My Social Book provides you with a return and exchange service within 30 days for whatever reason. After receiving and checking your return, they will send you an email notifying you that they have received the return and will refund your payment.
5. What are the Benefits of Becoming a Member?
As mentioned earlier, My Social Book pays attention to consumers’ feelings and shopping experience, so they have established a membership system. If you become their member, they will do their best to provide you with the best service, including the latest product news notifications, preferential prices, and secret codes. You can go to their official website to learn more about the members, or you can immediately become their partner to take advantage of the great opportunity. By the way, you can also focus on My Social Book’s social media like Facebook or Twitter because they often release some offers on their social media platform.
6. How about Refer a Friend offer?
Join My Social Book Referral Program now! Once you refer your friends to purchase at their online store, you can earn points for further shopping. You can use the points to exchange for discount coupons! Besides, both you and the friend you refer to shopping on My Social Book will get exclusive discounts, which is also a very easy way to save money for both of you.
As the industry grows, My Social Book promises to mine valuable pictures for their customers and provide them with affordable, high-quality products. It’s not just a photo book; it’s a personal yearbook that tells the story of your life. The product is completely unique, and they are the only company selling it. In addition, the large number of My Social Book coupons that you can get from couponawk.com is enough to make you feel good. Why not try it once?
Source: https://linkcoupon.wordpress.com/2023/06/09/how-to-save-more-with-my-social-book-coupons/
0 notes
grouchydairy · 10 months
Quote
"Come, Boy, sit down. Sit down and rest." And the boy did. And the tree was happy.
#quotes
0 notes
teenageread · 10 months
Text
Review: Swimming in the Sea of Stars
Synopsis:
Journal entry: Heading to school. I know what everyone will say.  There goes the girl who tried to kill herself.
Addison is no stranger to feeling stressed, insecure, and sad. Her therapist recommended she keep a journal to help her understand those feelings better, which she really needs today. It’s her first day back to school, several weeks after she survived her suicide attempt. She knows there are rumors about why she did it: A lousy home life? Bullying? Heartbreak? None of them are true, but it doesn’t matter because Addison still feels like she’s drowning. She still holds secrets she’s not ready to share.
During the school day, Addison encounters four other students struggling with their own secrets:
Booker is anxious about seeing Addison. They were sort of a couple until he tried to kiss her. She fled and then tried to end her life. Those two things couldn’t be related, could they?
Celia feels trapped by her mother’s abusive boyfriend. She can guess why Addison did what she did.
Damion is TikTok-famous and thinks befriending Addison could boost his followers. But what no one knows is he needs the world to remember him since his sick mom doesn’t anymore.
Avery is considered a loner and doesn’t know Addison, but they have neighboring lockers. With Avery’s older brother in jail for dealing drugs, Avery is desperate for meaningful human connection.
Plot:
A month after she attempted suicide, Addison finds herself going back to her high school, trying to get her life back on track. Keeping journal entries like her therapist recommended, Addison recounts her first day back and talks about the people she meets, the people she tries to avoid, and her general uneasy feeling of being gawked at. Where the gawking does happen, Addison finds herself interacting with others who, like her, have bigger things going on than just high school. In the washroom she meets Celia, someone who is being abused by her mother’s boyfriend and has to sneak away to the safety of school each day. In the halls, she meets Damion, a TikTok-famous influencer who is desperate for public acknowledgment of his existence because he does not get it at home. Damion is also connected to Avery, an old childhood friend who needs Damion for a favor as she tries to put her family back together through the addition of a new member. Addison also runs into Booker, despite trying to avoid him, her ex-almost-boyfriend who doesn’t understand why Addison tried to end her life after their almost-kiss. What’s worse is that while Addison was trying to end her life, Booker’s best friend and cousin was fighting for his, as his cancer diagnosis indicated that hope might not cut it. Taking place within a single day, these five teens’ lives are irreversibly changed as they are all interconnected and interact in what some might consider a normal day, but for these teens, it is the start of a whole new chapter.
Thoughts:
Julie Wright writes a wowzer of a novel. Told from five teenage perspectives over a single day, Wright's novel emphasizes mental health with both realness and hopefulness. Each of our teens is dealing with their own significant problems - Celia/Damion with their moms, Addison/Booker with potential death, and Avery trying to put their family back together. With so much happening, the novel flies through your hands as you are eager to keep reading to get back to the character's plotline you like the most. Although written from a third-person perspective, Wright really makes you feel each teen's struggles and emotions as they go through their challenges, relying on others in the novel for help. Despite a big cast of characters, Wright makes it easy to figure out whose storyline each chapter follows by stating their name, along with including a diary entry from Addison. That's right: despite seeming to be the main character, we do not get to follow Addison around like the others, instead seeing how she interacts through her diary entries and how other characters talk to her. My main complaint about this novel is that the timeline, just one day, is so short that some storylines, like Celia's, do not feel as fleshed out as others, like Avery's. While Wright does resolve some characters' issues, like Celia's, despite feeling rushed, others, like Damion's, do not feel as resolved as Wright could have made them. The story just felt rushed and needed more time for the characters to sort themselves out, but more time would also ruin the magic of how Addison's first day back changed these lives with just small and simple interactions. Overall, it was a stellar read, and an important one, as Wright really dives into the mental health problems our teenagers, both in this novel and in real life, face, and how to empower them to get through these hard times and see the light, and the beauty, of staying.
Read more reviews: Goodreads
Buy the book: Amazon
1 note · View note
jcmarchi · 28 days
Text
Mamba Explained
New Post has been published on https://thedigitalinsider.com/mamba-explained/
The State Space Model taking on Transformers
Right now, AI is eating the world.
And by AI, I mean Transformers. Practically all the big breakthroughs in AI over the last few years are due to Transformers.
Mamba, however, belongs to an alternative class of models called State Space Models (SSMs). Importantly, for the first time, Mamba promises performance (and, crucially, scaling laws) similar to the Transformer's whilst being feasible at long sequence lengths (say, 1 million tokens). To achieve this long context, the Mamba authors remove the “quadratic bottleneck” in the Attention Mechanism. Mamba also runs fast – like “up to 5x faster than Transformer fast”1.
Mamba performs similarly to (or slightly better than) other Language Models on The Pile (source)
Gu and Dao, the Mamba authors, write:
Mamba enjoys fast inference and linear scaling in sequence length, and its performance improves on real data up to million-length sequences. As a general sequence model backbone, Mamba achieves state-of-the-art performance across several modalities such as language, audio, and genomics. On language modelling, our Mamba-3B model outperforms Transformers of the same size and matches Transformers twice its size, both in pretraining and downstream evaluation.
Here we’ll discuss:
The advantages (and disadvantages) of Mamba (🐍) vs Transformers (🤖),
Analogies and intuitions for thinking about Mamba, and
What Mamba means for Interpretability, AI Safety and Applications.
Problems with Transformers – Maybe Attention Isn’t All You Need
We’re very much in the Transformer-era of history. ML used to be about detecting cats and dogs. Now, with Transformers, we’re generating human-like poetry, coding better than the median competitive programmer, and solving the protein folding problem.
But Transformers have one core problem. In a transformer, every token can look back at every previous token when making predictions. For this lookback, we cache detailed information about each token in the so-called KV cache.
When using the Attention Mechanism, information from all previous tokens can be passed to the current token
This pairwise communication means a forward pass is O(n²) time complexity in training (the dreaded quadratic bottleneck), and each new token generated autoregressively takes O(n) time. In other words, as the context size increases, the model gets slower.
To add insult to injury, storing this key-value (KV) cache requires O(n) space.  Consequently, the dreaded CUDA out-of-memory (OOM) error becomes a significant threat as the memory footprint expands. If space were the only concern, we might consider adding more GPUs; however, with latency increasing quadratically, simply adding more compute might not be a viable solution.
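To make the quadratic bottleneck concrete, here's a minimal numpy sketch of single-head, causal dot-product attention. All names and shapes are illustrative rather than taken from any particular implementation; the point is the (n × n) score matrix, which is exactly the object that grows quadratically with context length:

```python
import numpy as np

def causal_attention(Q, K, V):
    """Single-head dot-product attention: O(n^2) time and memory in n."""
    n, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)                 # (n, n) matrix: the quadratic bottleneck
    mask = np.tril(np.ones((n, n), dtype=bool))   # token i may only attend to tokens <= i
    scores = np.where(mask, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                            # (n, d)

# toy usage
rng = np.random.default_rng(0)
n, d = 8, 16
out = causal_attention(rng.standard_normal((n, d)),
                       rng.standard_normal((n, d)),
                       rng.standard_normal((n, d)))
```

Doubling the context length quadruples both the size of that score matrix and the work needed to fill it in.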
On the margin, we can mitigate the quadratic bottleneck with techniques like Sliding Window Attention or clever CUDA optimisations like FlashAttention. But ultimately, for super long context windows (like a chatbot which remembers every conversation you’ve shared), we need a different approach.
Foundation Model Backbones
Fundamentally, all good ML architecture backbones have components for two important operations:
Communication between tokens
Computation within a token
The Transformer Block
In transformers, this is Attention (communication) and MLPs (computation). We improve transformers by optimising these two operations2.
We would like to substitute the Attention component3 with an alternative mechanism for facilitating inter-token communication. Specifically, Mamba employs a Control Theory-inspired State Space Model, or SSM, for Communication purposes while retaining Multilayer Perceptron (MLP)-style projections for Computation.
The Mamba Block
Like a Transformer made up of stacked transformer blocks, Mamba is made up of stacked Mamba blocks as above.
We would like to understand and motivate the choice of the SSM for sequence transformations.
Motivating Mamba – A Throwback to Temple Run
Imagine we’re building a Temple Run agent4. It chooses whether the runner should move left or right at any given moment.
To successfully pick the correct direction, we need information about our surroundings. Let’s call the collection of relevant information the state. Here the state likely includes your current position and velocity, the position of the nearest obstacle, weather conditions, etc.
Claim 1: if you know the current state of the world and how the world is evolving, then you can use this to determine the direction to move.
Note that you don’t need to look at the whole screen all the time. You can figure out what will happen to most of the screen by noting that as you run, the obstacles move down the screen. You only need to look at the top of the screen to understand the new information and then simulate the rest.
This lends itself to a natural formulation. Let h be the hidden state, relevant knowledge about the world. Also let x be the input, the observation that you get each time. h’ then represents the derivative of the hidden state, i.e. how the state is evolving. We’re trying to predict y, the optimal next move (right or left).
Now, Claim 1 states that from the hidden state h, h’, and the new observation x, you can figure out y.
More concretely, h, the state, can be represented as a differential equation (Eq 1a):
$h'(t) = \mathbf{A}h(t) + \mathbf{B}x(t)$
Knowing h allows you to determine your next move y (Eq 1b):
$y(t) = \mathbf{C}h(t) + \mathbf{D}x(t)$
The system’s evolution is determined by its current state and newly acquired observations. A small new observation is enough, as the majority of the state can be inferred by applying known state dynamics to its previous state. That is, most of the screen isn’t new, it’s just a continuation of the previous state’s natural downward trajectory. A full understanding of the state would enable optimal selection of the subsequent action, denoted as y.
You can learn a lot about the system dynamics by observing the top of the screen. For instance, increased velocity of this upper section suggests an acceleration of the rest of the screen as well, so we can infer that the game is speeding up5. In this way, even if we start off knowing nothing about the game and only have limited observations, it becomes possible to gain a holistic understanding of the screen dynamics fairly rapidly.
What’s the State?
Here, state refers to the variables that, when combined with the input variables, fully determine the future system behaviour. In theory, once we have the state, there’s nothing else we need to know about the past to predict the future. With this choice of state, the system is converted to a Markov Decision Process. Ideally, the state is a fairly small amount of information which captures the essential properties of the system. That is, the state is a compression of the past6.
Discretisation – How To Deal With Living in a Quantised World
Okay, great! So, given some state and input observation, we have an autoregressive-style system to determine the next action. Amazing!
In practice though, there’s a little snag here. We’re modelling time as continuous. But in real life, we get new inputs and take new actions at discrete time steps7.
We would like to convert this continuous-time differential equation into a discrete-time difference equation. This conversion process is known as discretisation. Discretisation is a well-studied problem in the literature. Mamba uses the Zero-Order Hold (ZOH) discretisation8. To give an idea of what’s happening morally, consider a naive first-order approximation9.
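For reference, the ZOH discretisation from the Mamba paper gives, for step size $\Delta$:

$$\bar{\mathbf{A}} = \exp(\Delta \mathbf{A}), \qquad \bar{\mathbf{B}} = (\Delta \mathbf{A})^{-1}\left(\exp(\Delta \mathbf{A}) - I\right) \Delta \mathbf{B}$$

The naive first-order approximation worked through below instead yields $\bar{\mathbf{A}} \approx I + \Delta\mathbf{A}$ and $\bar{\mathbf{B}} \approx \Delta\mathbf{B}$.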
From Equation 1a, we have
$h'(t) = \mathbf{A}h(t) + \mathbf{B}x(t)$
And for small ∆,
$h'(t) \approx \frac{h(t+\Delta) - h(t)}{\Delta}$
by the definition of the derivative.
We let:
$h_t = h(t)$
and
$h_{t+1} = h(t + \Delta)$
and substitute into Equation 1a giving:
$h_{t+1} - h_t \approx \Delta (\mathbf{A}h_t + \mathbf{B}x_t)$ $\Rightarrow h_{t+1} \approx (I + \Delta \mathbf{A})h_t + (\Delta \mathbf{B})x_t$
Hence, after renaming the coefficients and relabelling indices, we have the discrete representations:
The Discretised Version of the SSM Equation
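Written out explicitly (with $\bar{\mathbf{A}} = I + \Delta\mathbf{A}$ and $\bar{\mathbf{B}} = \Delta\mathbf{B}$ under this first-order approximation), the discrete system is:

$$h_{t+1} = \bar{\mathbf{A}}h_t + \bar{\mathbf{B}}x_t$$

$$y_t = \mathbf{C}h_t + \mathbf{D}x_t$$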
If you’ve ever looked at an RNN before10 and this feels familiar – trust your instincts:
We have some input x, which is combined with the previous hidden state by some transform to give the new hidden state. Then we use the hidden state to calculate the output at each time step.
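As a concrete illustration, here's a minimal numpy sketch of that recurrence for a single input channel (all names, shapes, and parameter values are illustrative):

```python
import numpy as np

def run_ssm(xs, A_bar, B_bar, C, D):
    """Sequentially scan a discretised SSM over a 1-D input sequence.

    xs:    (T,)   input sequence (a single channel)
    A_bar: (N, N) state transition -- "how should I forget?"
    B_bar: (N,)   input matrix     -- "what should I remember?"
    C:     (N,)   output matrix    -- "how do I use the state to predict?"
    D:     scalar skip connection  -- "how do I use the raw input?"
    """
    h = np.zeros(A_bar.shape[0])       # hidden state starts empty
    ys = []
    for x in xs:                       # RNN-style: one step per token
        h = A_bar @ h + B_bar * x      # h_{t+1} = A_bar h_t + B_bar x_t
        ys.append(C @ h + D * x)       # y_t = C h_t + D x_t
    return np.array(ys)

# toy usage
rng = np.random.default_rng(0)
N, T = 4, 10
A_bar = np.diag(rng.uniform(0.5, 0.99, N))   # diagonal, near-identity transition
print(run_ssm(rng.standard_normal(T), A_bar,
              rng.standard_normal(N), rng.standard_normal(N), 1.0))
```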
Understanding the SSM Matrices
Now, we can interpret the A, B, C, D matrices more intuitively:
A is the transition state matrix. It shows how you transition the current state into the next state. It asks “How should I forget the less relevant parts of the state over time?”
B is mapping the new input into the state, asking “What part of my new input should I remember?”11
C is mapping the state to the output of the SSM. It asks, “How can I use the state to make a good next prediction?”12
D is how the new input passes through to the output. It’s a kind of modified skip connection that asks “How can I use the new input in my prediction?”
Visual Representation of The SSM Equations
Additionally, ∆ has a nice interpretation – it’s the step size, or what we might call the linger time or the dwell time. For large ∆, you focus more on that token; for small ∆, you skip past the token immediately and don’t include it much in the next state.
(source)
And that’s it! That’s the SSM, our ~drop-in replacement for Attention (Communication) in the Mamba block. The Computation in the Mamba architecture comes from regular linear projections, non-linearities, and local convolutions.
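To show how those pieces fit together, here's a simplified sketch of one Mamba block, loosely following the paper's block diagram. Weight names are illustrative, the SSM is abstracted into a callable, and details like normalisation are omitted:

```python
import numpy as np

def silu(z):
    """SiLU/Swish non-linearity."""
    return z / (1.0 + np.exp(-z))

def causal_depthwise_conv(u, kernels):
    """Short causal conv over time, one kernel per channel. u: (T, d), kernels: (k, d)."""
    k, d = kernels.shape
    padded = np.vstack([np.zeros((k - 1, d)), u])    # left-pad so position t sees only <= t
    return np.stack([(padded[t:t + k] * kernels).sum(axis=0) for t in range(u.shape[0])])

def mamba_block(x, p, ssm):
    """Simplified Mamba block: project in, conv + SSM on one branch,
    a SiLU gate on the other, multiply, project back out.
    x: (T, d_model); p: dict of weights (illustrative names); ssm: SSM callable."""
    a = silu(causal_depthwise_conv(x @ p["W_a"], p["conv"]))  # SSM branch (Communication)
    g = silu(x @ p["W_g"])                                    # gate branch
    y = ssm(a) * g                                            # sequence mixing, then gating
    return y @ p["W_out"] + x                                 # projection (Computation) + residual
```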
Okay great, that’s the theory – but does this work? Well…
Effectiveness vs Efficiency: Attention is Focus, Selectivity is Prioritisation
At WWDC ‘97, Steve Jobs famously noted that “focusing is about saying no”. Focus is ruthless prioritisation. It’s common to think about Attention positively as choosing what to notice. In the Steve Jobs sense, we might instead frame Attention negatively as choosing what to discard.
There’s a classic intuition pump in Machine Learning known as the Cocktail Party Problem13. Imagine a party with dozens of simultaneous loud conversations:
Question:
How do we recognise what one person is saying when others are talking at the same time?14
Answer:
The brain solves this problem by focusing your “attention” on a particular stimulus and hence drowning out all other sounds as much as possible.
Transformers use Dot-Product Attention to focus on the most relevant tokens. A big reason Attention is so great is that you have the potential to look back at everything that ever happened in its context. This is like photographic memory when done right.15
Transformers (🤖) are extremely effective. But they aren’t very efficient. They store everything from the past so that they can look back at tokens with theoretically perfect recall.
Traditional RNNs (🔁) are the opposite – they forget a lot, only recalling a small amount in their hidden state and discarding the rest. They are very efficient – their state is small. Yet they are less effective as discarded information cannot be recovered.
We’d like something closer to the Pareto frontier of the effectiveness/efficiency tradeoff. Something that’s more effective than traditional RNNs and more efficient than transformers.
The Mamba Architecture seems to offer a solution which pushes out the Pareto frontier of effectiveness/efficiency.
SSMs are as efficient as RNNs, but we might wonder how effective they are. After all, it seems like they would have a hard time discarding only unnecessary information and keeping everything relevant. If each token is being processed the same way, applying the same A and B matrices as if in a factory assembly line for tokens, there is no context-dependence. We would like the forgetting and remembering matrices (A and B respectively) to vary and dynamically adapt to inputs.
The Selection Mechanism
Selectivity allows each token to be transformed into the state in a way that is unique to its own needs. Selectivity is what takes us from vanilla SSM models (applying the same A (forgetting) and B (remembering) matrices to every input) to Mamba, the Selective State Space Model.
In regular SSMs, A, B, C and D are learned matrices – that is
$\mathbf{A} = \mathbf{A}_\theta$ etc. (where $\theta$ represents the learned parameters)
With the Selection Mechanism in Mamba, A, B, C and D are also functions of x. That is, $\mathbf{A} = \mathbf{A}_\theta(x)$ etc.; the matrices are context-dependent rather than static.
Mamba (right) differs from traditional SSMs by allowing A,B,C matrices to be selective i.e. context dependent (source)
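To make the Selection Mechanism concrete, here's a simplified numpy sketch. It follows the spirit of the paper (input-dependent B, C and Δ, with ZOH applied to a diagonal A) but simplifies aggressively: a single scalar Δ per timestep shared across channels, and the paper's simplified approximation $\bar{\mathbf{B}} \approx \Delta\mathbf{B}$. All names and values are illustrative:

```python
import numpy as np

def selective_ssm(xs, A, W_B, W_C, W_dt, D):
    """Simplified selective SSM scan.

    xs:   (T, d) input sequence
    A:    (d, N) fixed diagonal state matrix (entries < 0 for stability)
    W_B:  (d, N) projection producing the input-dependent B_t
    W_C:  (d, N) projection producing the input-dependent C_t
    W_dt: (d,)   projection producing the input-dependent step size Δ_t
    D:    (d,)   skip connection
    """
    h = np.zeros_like(A)                       # one N-dim state per channel: (d, N)
    ys = np.zeros_like(xs)
    for t, x in enumerate(xs):
        dt = np.log1p(np.exp(x @ W_dt))        # softplus keeps the step size positive
        B_t = x @ W_B                          # "what should I remember?" (depends on x)
        C_t = x @ W_C                          # "how should I read the state?" (depends on x)
        A_bar = np.exp(dt * A)                 # ZOH for a diagonal A
        h = A_bar * h + dt * np.outer(x, B_t)  # simplified B_bar x_t ≈ Δ B_t x_t
        ys[t] = h @ C_t + D * x                # readout plus skip connection
    return ys

# toy usage with random parameters
rng = np.random.default_rng(0)
T, d, N = 16, 8, 4
ys = selective_ssm(rng.standard_normal((T, d)),
                   -np.exp(rng.standard_normal((d, N))),  # negative diagonal A
                   rng.standard_normal((d, N)) * 0.1,
                   rng.standard_normal((d, N)) * 0.1,
                   rng.standard_normal(d) * 0.1,
                   np.ones(d))
```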
Making A and B functions of x allows us to get the best of both worlds:
We’re selective about what we include in the state, which improves effectiveness vs traditional SSMs.
Yet, since the state size is bounded, we improve on efficiency relative to the Transformer. We have O(1), not O(n), space requirements and O(n), not O(n²), time requirements.
The Mamba paper authors write:
The efficiency vs. effectiveness tradeoff of sequence models is characterized by how well they compress their state: efficient models must have a small state, while effective models must have a state that contains all necessary information from the context. In turn, we propose that a fundamental principle for building sequence models is selectivity: or the context-aware ability to focus on or filter out inputs into a sequential state. In particular, a selection mechanism controls how information propagates or interacts along the sequence dimension.
Humans (mostly) don’t have photographic memory for everything they experience within a lifetime – or even within a day! There’s just way too much information to retain it all. Subconsciously, we select what to remember by choosing to forget, throwing away most information as we encounter it. Transformers (🤖) decide what to focus on at recall time. Humans (🧑) also decide what to throw away at memory-making time. Humans filter out information early and often.
If we had infinite capacity for memorisation, it’s clear the transformer approach is better than the human approach – it truly is more effective. But it’s less efficient – transformers have to store so much information about the past that might not be relevant. Transformers (🤖) only decide what’s relevant at recall time. The innovation of Mamba (🐍) is allowing the model better ways of forgetting earlier – it’s focusing by choosing what to discard using Selectivity, throwing away less relevant information at memory-making time16.
The Problems of Selectivity
Applying the Selection Mechanism does have its gotchas though. Non-selective SSMs (i.e. A, B not dependent on x) are fast to compute in training. This is because the component of $y_t$ which depends on $x_i$ can be expressed as a linear map, i.e. a single matrix that can be precomputed!
For example (ignoring the D component, the skip connection):
$$y_2 = \mathbf{C}\mathbf{B}x_2 + \mathbf{C}\mathbf{A}\mathbf{B}x_1 + \mathbf{C}\mathbf{A}^2\mathbf{B}x_0$$
If we’re paying attention, we might spot something even better here – this expression can be written as a convolution. Hence we can apply the Fast Fourier Transform and the Convolution Theorem to compute this very efficiently on hardware as in Equation 3 below.
We can calculate Equation 2, the SSM equations, efficiently in the Convolutional Form, Equation 3.
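As a sketch of why this matters computationally (illustrative names throughout, and ignoring the D skip connection as above): for fixed $\bar{\mathbf{A}}, \bar{\mathbf{B}}, \mathbf{C}$ we can materialise the kernel $K = (\mathbf{C}\bar{\mathbf{B}}, \mathbf{C}\bar{\mathbf{A}}\bar{\mathbf{B}}, \mathbf{C}\bar{\mathbf{A}}^2\bar{\mathbf{B}}, \dots)$ once and apply it to the whole sequence with an FFT in O(T log T), in the spirit of Equation 3:

```python
import numpy as np

def ssm_conv_kernel(A_bar, B_bar, C, T):
    """Materialise the SSM as a length-T convolution kernel
    K = (C B_bar, C A_bar B_bar, C A_bar^2 B_bar, ...).
    Only valid when A_bar, B_bar, C are input-independent."""
    K = np.empty(T)
    v = B_bar.copy()
    for t in range(T):
        K[t] = C @ v        # K[t] = C A_bar^t B_bar
        v = A_bar @ v
    return K

def ssm_via_fft(xs, K):
    """Causal convolution of the inputs with the kernel via FFT: O(T log T)."""
    T = len(xs)
    n = 2 * T               # zero-pad to avoid circular wrap-around
    y = np.fft.irfft(np.fft.rfft(xs, n) * np.fft.rfft(K, n), n)
    return y[:T]

# toy usage: same answer as the sequential scan, computed in parallel
rng = np.random.default_rng(0)
N, T = 4, 32
A_bar = np.diag(rng.uniform(0.5, 0.9, N))
B_bar, C = rng.standard_normal(N), rng.standard_normal(N)
y = ssm_via_fft(rng.standard_normal(T), ssm_conv_kernel(A_bar, B_bar, C, T))
```

The output matches the sequential scan; the point is that this shortcut only exists when the matrices don't depend on the input.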
Unfortunately, with the Selection Mechanism, we lose the convolutional form. Much attention is given to making Mamba efficient on modern GPU hardware using similar hardware optimisation tricks to Tri Dao’s Flash Attention17. With the hardware optimisations, Mamba is able to run faster than comparably sized Transformers.
Machine Learning for Political Economists – How Large Should The State Be?
The Mamba authors write, “the efficiency vs. effectiveness tradeoff of sequence models is characterised by how well they compress their state”. In other words, like in political economy18, the fundamental problem is how to manage the state.
🔁 Traditional RNNs are anarchic
They have a small, minimal state. The size of the state is bounded. The compression of state is poor.
🤖 Transformers are communist
They have a maximally large state. The “state” is just a cache of the entire history with no compression. Every context token is treated equally until recall time.
🐍Mamba has a compressed state
…but it’s selective about what goes in. Mamba says we can get away with a small state if the state is well focused and effective19.
Language Models and State Size
The upshot is that state representation is critical. A smaller state is more efficient; a larger state is more effective. The key is to selectively and dynamically compress data into the state. Mamba’s Selection Mechanism allows for context-dependent reasoning, focusing and ignoring. For both performance and interpretability, understanding the state seems to be very useful.
Information Flow in Transformer vs Mamba
How do Transformers know anything? At initialization, a transformer isn’t very smart. It learns in two ways:
Training data (Pretraining, SFT, RLHF etc)
In-context data
Training Data
Models learn from their training data. This is a kind of lossy compression of input data into the weights. We can think of the effect of pretraining data on the transformer kinda like the effect of your ancestors’ experiences on your genetics – you can’t recall their experiences, you just have vague instincts about them20.
In-Context Data
Transformers use their context as short-term memory, which they can recall with ~perfect fidelity. So we get In-Context Learning, e.g. using induction heads to solve the Indirect Object Identification task, or computing Linear Regression.
Retrieval
Note that Transformers don’t filter their context at all until recall time. So if we have a bunch of information we think might be useful to the Transformer, we filter it outside the Transformer (using Information Retrieval strategies) and then stuff the results into the prompt. This process is known as Retrieval Augmented Generation (RAG). RAG determines relevant information for the context window of a transformer. A human with the internet is kinda like a RAG system – you still have to know what to search but whatever you retrieve is as salient as short-term memory to you.
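A toy sketch of that outside-the-model filtering (the embedding vectors are assumed to come from some separate encoder; all names here are illustrative):

```python
import numpy as np

def retrieve(query_vec, doc_vecs, docs, k=3):
    """Toy retrieval: rank documents by cosine similarity to the query."""
    sims = doc_vecs @ query_vec / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec) + 1e-9)
    top = np.argsort(-sims)[:k]
    return [docs[i] for i in top]

def build_rag_prompt(question, retrieved):
    """The filtering happened outside the model; stuff the results into the prompt."""
    context = "\n".join(retrieved)
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
```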
Information Flow for Mamba
Training Data acts similarly for Mamba. However, the lines are slightly blurred for in-context data and retrieval. In-context data for Mamba is compressed/filtered similarly to how retrieval data is for transformers. This in-context data is also accessible for look-up, as it is for transformers (although with somewhat lower fidelity).
Transformer context is to Mamba states what short-term is to long-term memory. Mamba doesn’t just have “RAM”, it has a hard drive21 22.
Swapping States as a New Prompting Paradigm
Currently, we often use RAG to give a transformer contextual information.
With Mamba-like models, you could instead imagine having a library of states created by running the model over specialised data. States could be shared kinda like LoRAs for image models.
For example, I could do inference on 20 physics textbooks and, say, 100 physics questions and answers. Then I have a state which I can give to you. Now you don’t need to add any few-shot examples; you simply ask your question. The in-context learning is in the state.
In other words, you can drag and drop downloaded states into your model, like literal plug-in cartridges. And note that “training” a state doesn’t require any backprop. It’s more like a highly specialised one-pass fixed-size compression algorithm. This is unlimited in-context learning applied at inference time for zero additional compute or latency23.
The structure of an effective LLM call goes from…
System Prompt
Preamble
Few shot-examples
Question
…for Transformers, to simply…
Inputted state (with problem context, initial instructions, textbooks, and few-shot examples)
Short question
…for Mamba.
This is cheaper and faster than few-shot prompting (as the state is infinitely reusable without inference cost). It’s also MUCH cheaper than finetuning and doesn’t require any gradient updates. We could imagine retrieving states in addition to context.
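As a sketch of what such a workflow might look like – every class, function, and file name below is hypothetical, since no standard API for this exists:

```python
# Entirely hypothetical API -- a sketch of the workflow, not a real library.
model = MambaLM.load("mamba-3b")                      # hypothetical loader

# "Train" a state: a single forward pass over specialised data, no backprop.
state = model.init_state()
for doc in physics_textbooks + physics_qa_examples:   # assumed corpora
    state = model.step(tokenize(doc), state)          # fixed-size compression

state.save("physics.state")                           # shareable, like a LoRA

# Later: anyone loads the state and just asks a short question.
state = MambaState.load("physics.state")
print(model.generate("Why does a gyroscope precess?", state=state))
```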
Mamba & Mechanistic Interpretability
Transformer interpretability typically involves:
understanding token relationships via attention,
understanding circuits, and
using Dictionary Learning for unfolding MLPs.
Most of the ablations that we would like to do for Mamba are still valid, but understanding token communication (1) is now more nuanced. All information moves between tokens via hidden states instead of the Attention Mechanism which can “teleport” information from one sequence position to another.
For understanding in-context learning (ICL) tasks with Mamba, we will look to intervene on the SSM state. A classic in-context learning task is Indirect Object Identification, in which a model has to finish a paragraph like:
Then, Shelby and Emma had a lot of fun at the school. [Shelby/Emma] gave an apple to [BLANK]
The model is expected to fill in the blank with the name that is not repeated in the paragraph. In the chart below we can see that information is passed from the [Shelby/Emma] position to the final position via the hidden state (see the two blue lines in the top chart).
Since it’s hypothesised that much of In-Context Learning in Transformers is downstream of more primitive sequence position operations (like Induction Heads), Mamba being able to complete this task suggests a more general In-Context Learning ability.
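A sketch of the kind of state-patching experiment this implies, in the style of activation patching. Every model-internal accessor here is a hypothetical stand-in (a real experiment would use a library such as nnsight, mentioned in the acknowledgements, with its own API):

```python
import numpy as np

# Hypothetical sketch -- not a real API.
clean   = "Then, Shelby and Emma had a lot of fun at the school. Shelby gave an apple to"
corrupt = "Then, Shelby and Emma had a lot of fun at the school. Emma gave an apple to"

_, clean_states = model.run_with_cache(clean)   # hypothetical: SSM state per (layer, position)
n_layers, n_pos = len(clean_states), len(clean_states[0])
effect = np.zeros((n_layers, n_pos))

for layer in range(n_layers):
    for pos in range(n_pos):
        # Re-run on the corrupted prompt, overwriting one hidden state with its
        # value from the clean run, and measure how much behaviour is restored.
        logits = model.run_with_state_patch(corrupt, layer, pos,
                                            clean_states[layer][pos])    # hypothetical
        effect[layer, pos] = logit_for(logits, " Emma") - logit_for(logits, " Shelby")
```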
What’s Next for Mamba & SSMs?
Mamba-like models are likely to excel in scenarios requiring extremely long context and long-term memory. Examples include:
Processing DNA
Generating (or reasoning over) video
Writing novels
An illustrative example is agents with long-term goals.
Suppose you have an agent interacting with the world. Eventually, its experiences become too much for the context window of a transformer. The agent then has to compress or summarise its experiences into some more compact representation.
But how do you decide what information is the most useful as a summary? If the task is language, LLMs are actually fairly good at summaries – okay, yeah, you’ll lose some information, but the most important stuff can be retained.
However, for other disciplines, it might not be clear how to summarise. For example, what’s the best way to summarise a 2-hour movie?24 Could the model itself learn to do this naturally, rather than through a hacky workaround like trying to describe the aesthetics of the movie in text?
This is what Mamba allows. Actual long-term memory. A real state where the model learns to keep what’s important. Prediction is compression – learning what’s useful to predict what’s coming next inevitably leads to building a useful compression of the previous tokens.
The implications for Assistants are clear:
Your chatbot co-evolves with you. It remembers.
The film HER is looking better and better as time goes on 😳
Agents & AI Safety
One reason for positive updates on existential risk from AGI is the rise of Language Models. Previously, Deep-RL agents trained via self-play looked set to be the first AGIs. Language models are inherently much safer since they aren’t trained with long-term goals25.
The potential for long-term sequence reasoning here brings back the importance of agent-based AI safety. Few agent worries are relevant to Transformers with an 8k context window. Many are relevant to systems with impressive long-term memories and possible instrumental goals.
The Best Collab Since Taco Bell & KFC: 🤖 x 🐍
The Mamba authors show that there’s value in combining Mamba’s long context with the Transformer’s high fidelity over short sequences. For example, if you’re making long videos, you likely can’t fit a whole movie into a Transformer’s context for attention26. You could imagine having Attention look at the most recent frames for short-term fluidity and an SSM for long-term narrative consistency27.
This isn’t the end for Transformers. Their high effectiveness is exactly what’s needed for many tasks. But now Transformers aren’t the only option. Other architectures are genuinely feasible.
So we’re not in the post-Transformer era. But for the first time, we’re living in the post-only-Transformers era28. And this blows the possibilities wide open for sequence modelling with extreme context lengths and native long-term memory.
Two ML researchers, Sasha Rush (HuggingFace, Annotated Transformer, Cornell Professor) and Jonathan Frankle (Lottery Ticket Hypothesis, MosaicML, Harvard Professor), currently have a bet here.
Currently Transformers are far and away in the lead. With 3 years left, there’s now a research direction with a fighting chance.
All that remains to ask is: Is Attention All We Need?
Footnotes

1. See Figure 8 in the Mamba paper.
2. And scaling up with massive compute.
3. More specifically, the scaled dot-product Attention popularised by Transformers.
4. For people who don’t see Temple Run as the cultural cornerstone it is 🤣 Temple Run was an iPhone game from 2011, similar to Subway Surfer.
5. Here we assume the environment is sufficiently smooth.
6. One pretty important constraint for this to be efficient is that we don’t allow the individual elements of the state vector to interact with each other directly. We’ll use a combination of the state dimensions to determine the output, but we don’t e.g. allow the velocity of the runner and the direction of the closest obstacle (or whatever else was in our state) to interact directly. This helps with efficient computation, and we achieve this practically by constraining A to be a diagonal matrix.
7. Concretely, consider the case of Language Models – each token is a discrete step.
8. ZOH also has nice properties for the initialisations – we want A_bar to be close to the identity so that the state can be mostly maintained from timestep to timestep if desired. ZOH gives A_bar as an exponential, so any diagonal element initialisations close to zero give values close to 1.
9. This is known as the Euler discretisation in the literature.
10. It’s wild to note that some readers might not have – we’re so far into the age of Attention that RNNs have been forgotten!
11. B is like the Query (Q) matrix for Transformers.
12. C is like the Output (O) matrix for Transformers.
13. Non-alcoholic options also available!
14. Especially as all voices roughly occupy the same space on the audio frequency spectrum – intuitively this seems really hard!
15. Note that photographic memory doesn’t necessarily imply perfect inferences from that memory!
16. To be clear, if you have a short sequence, then a transformer should theoretically be a better approach. If you can store the whole context, then why not!? If you have enough memory for a high-resolution image, why compress it into a JPEG? But Mamba-style architectures are likely to hugely outperform with long-range sequences.
17. More details are available for engineers interested in CUDA programming – Tri’s talk, Mamba paper section 3.3.2, and the official CUDA code are good resources for understanding the Hardware-Aware Scan.
18. Or in Object-Oriented Programming.
19. Implications for actual Political Economy are left to the reader, but maybe Gu and Dao accidentally solved politics!?
20. This isn’t a perfect analogy, as human evolution follows a genetic algorithm rather than SGD.
21. Albeit a pretty weird hard drive at that – it morphs over time rather than being a fixed representation.
22. As a backronym, I’ve started calling the hidden_state the state space dimension (or selective state dimension), which shortens to SSD – a nice reminder for what this object represents: the long-term memory of the system.
23. I’m thinking about this similarly to the relationship between harmlessness finetuning and activation steering. State swapping, like activation steering, is an inference-time intervention giving comparable results to its train-time analogue.
24. This is a very non-trivial problem! How do human brains represent a movie internally? It’s not a series of the most salient frames, nor is it a text summary of the colours, nor is it a purely vibes-based summary even if you can memorise some lines of the film.
25. They’re also safer since they inherently understand (though don’t necessarily embody) human values. It’s not at all clear how to teach an RL agent human morality.
26. Note that typically an image (i.e. a single frame) counts as >196 tokens, and movies are typically 24 fps, so you’ll fill a 32k context window in 7 seconds 🤯
27. Another possibility that I’m excited about is applying optimisation pressure to the state itself, as well as the output, to have models that respect particular use cases.
28. This is slightly hyperbolic – the TS-Mixer for time series, Gradient Boosting Trees for tabular data, and Graph Neural Networks for weather prediction exist and are currently used, but these aren’t at the core of AI.
Author Bio
Kola Ayonrinde is a Research Scientist and Machine Learning Engineer with a flair for writing. He integrates technology and creativity, focusing on applying machine learning in innovative ways and exploring the societal impacts of tech advancements.
Acknowledgements
This post was originally posted on Kola’s personal blog.
Thanks to Gonçalo for reading an early draft, Jaden for the nnsight library used for the Interpretability analysis, and Tessa for the Mamba patching visualisations. Also see: Mamba paper, Mamba Python code, Annotated S4, Nathan Labenz podcast
Citation
For attribution in academic contexts or books, please cite this work as
Kola Ayonrinde, "Mamba Explained," The Gradient, 2024
@article{Ayonrinde2024mamba,
  author = {Kola Ayonrinde},
  title = {Mamba Explained},
  journal = {The Gradient},
  year = {2024},
  howpublished = {\url{https://thegradient.pub/mamba-explained}},
}
0 notes
ramyeongif · 11 months
Quote
"Come, Boy, sit down. Sit down and rest." And the boy did. And the tree was happy.
#quotes
1 note · View note
kiernanshayemckay · 11 months
Photo
Tumblr media
Harry Winston - Round Brilliant Reviere Diamond Necklace
119 Diamonds with total weight of 12.68 Carat
Infatuation - A Michael McKendrick Billionaire Novel  
Kasey is gifted the above necklace for her birthday - in Infatuation.
“Infatuation is a journey of Kasey, Jeffery, and Michael, who hope to find love. Kasey examines her life while in New York City. She is in a long-term relationship with Jeffery Jonsson, M.D., a neurosurgeon and psychiatrist, who is desperately in love with her and lives in Washington D.C. However, in a chance encounter, she meets Michael McKendrick, a billionaire looking for his perfect match. Will Kasey turn to Jeffery for love…or Michael? Will she stay in New York City...or return home to Washington, D.C.? The question is: What are the ramifications of the journey to find love?”
Visit me on www.kiernanshayemckay.com
Book due to release on www.amazon.com in December 2023
Have a great Night
Kiernan Shaye
1 note · View note