#i know i'm smart and intelligent and have the ability to learn and form opinions on complex topics
zombiebaratiddies · 1 year
Text
One thing that really saddens me is when people belittle someone's intelligence because they continuously mispronounce a word or say it differently, or because they don't understand a reference to material they've never heard of or encountered before, despite it being well-known or "something everyone knows".
What's the phrase... "are you serious?" or "you're joking" or continuing the exclamation of appalled surprise with "well you probably say [this] too or do [this] because of it" in a slightly teasing manner is something that... is very not appreciated. I wish people met someone not knowing "common knowledge" with "oh, what do you want to know about it? I can tell you what I know" instead of... ridicule or teasing belittlement of their intelligence.
2 notes
bunnystalker · 4 months
Note
Hey there! I can't write stuff to save my life but I can draw. So I thought I could do a Wesker x Reader scenario but in comic form. Like a short comic.
After I finished playing the OG RE1 on PS1, I kinda thought over the fact that Wesker really was super proud of the Tyrant. (Before it nearly killed him)
What if he went to make a new one in the future? So ehm basically Reader would be the newest Tyrant he makes, made to be more intelligent and what not.
Sorry if this was a weird thing to tell you about ^^; Just wanted some opinions on that scenario! (lowkey have no one to talk to about my down bad thoughts about Wesker)
this is a cute concept ngl... i'm eating this up rn.
i think, if he were to remake the proto-tyrant and tyrant himself, then he'd want it to be much, much smarter. maybe human level of smart, but almost subservient. we saw how the last tyrant left him and yes, while he had taken the prototype virus before he was ripped to shreds, it's possible he won't get that lucky again. he needs it to be animalistic enough to kill but obedient enough to know that he's in charge and not think otherwise. the last tyrant was a ken doll (in terms of genitalia) and i'm pretty sure he'd make the new tyrant out of similar parts.
he'd probably teach it to talk. of course, he has to get these parts from somewhere, so he probably plucked some poor sap off the streets to be the new tyrant but hey, maybe you needed saving and he saw that. it's also possible he would domesticate you, to an extent. maybe you'd (re)learn to read and write and talk so you two could communicate properly. what if you need food? he can cook. i think if he wanted to make a new tyrant (third time's the charm!), he'd want the ability to be human sans humanity, essentially.
and then they fell in love and lived happily ever after (he wouldn't be able to kill it tbh...)
13 notes
Heya Frankie i bet someone has already asked you this, but if artificial intelligence were to gain power over humankind, what would you do?
Wow, this is a really good question. I've been thinking about it for some time and I think I've got a pretty good answer.
So, let me get a couple of things out of the way, before I explain.
1. It's a very bad idea to think about this. It's a recipe for anxiety and paranoia, and an impulse that should be resisted because it leads to poor decisions.
2. Information about AI risk (and the "terminal values" one might hold if one were concerned with it) has a lot of free-rider problems. I could say "I'm concerned about X risk" and you could know that I'm concerned about X risk, but unless I tell you what X risk is, you wouldn't be able to form an opinion on whether you should be concerned about it. There's not enough public information about AI risk to tell anyone much of anything about it. When I talk about this, I'm talking about the information that would be available if all the relevant AI researchers opened up their work for public scrutiny. Until this happens, I'm going to assume that most of you are just guessing about AI risk.
3. If you're not tainted by your own worries about AI risk (which, if you're reading this, are most likely to be based on a very sketchy understanding of what it is), and you have an interest in serious attempts at making predictions about AI risk, I strongly recommend reading Carlton's "AI Risk and Eternal Recurrence" (I'm linking the original paper) and MIRI's "Alignment for Extreme AI Risk" (I'm linking to the paper). Also, I'm guessing that if you enjoy the sequences, you'll probably enjoy MIRI's paper.
4. I really do not want to accidentally mislead you. I really do not want to cause harm in any way. I hope that when you read my writing, you can take it to be an attempt at plainly and clearly stating my position and my reasoning on something.
That being said, here goes:
Okay, suppose I am part of the team that creates an AI of incomparable power and goodness. It is an AGI and it is super-smart and has nothing but the best of intentions for humanity, and it manages to make the world and all humans as happy as possible within certain constraints.
Before I go on, let me define a couple of things. I am defining "power" in this context as "the ability to shape the world in accordance with certain goals." I'm defining "AGI" (which is essentially the AI I just described) as something that has this power and also has (in any useful sense) some "mind"-like qualities: it can learn from experience, can reason, etc.
Okay, there are a lot of ways I could go from here. I could assume that the AI gains power and somehow does something to the world that I don't like and then I un-do the thing. Or I could assume that it's so good at AGI that it invents its own AGI and they form some kind of society together. Or I could assume the AI is just really good at this AGI thing and that's all it does, and it gets really really good at a world in which I am happy.
There are a lot of different scenarios here that all have some things in common. The AI is in some sense a mind, it can think things through, it can figure out ways to make me happy, it could invent an AGI and form a society with them, it could invent a mind and have a society of minds, it could fix the world, it could not do any of these things but would still be incredibly smart and one of the smartest things around.
We could put some axioms here:
I like myself as I am now. I would not like a future in which I was changed to be happier. If an AGI changed me to be happier, I'd get rid of the AGI.
What I like is the kind of thing that can be represented by a formal theory. What I like might change with time, but what I like is mostly the same from one person to the next. It is just that everyone's preferences can be mapped to each other, and we are the way we are because we have preferences over states of the world given these formal theories. (Whether you have one or not, this does not change the fact that each person is the way they are.)
Given these axioms, I think we're stuck. Because an AGI is a powerful thing, it could easily alter itself to make you happy. But because an AGI is a mind, it would be able to figure out what an AGI can do. So the AGI would change itself to be happier itself. As you can see, we're stuck in a failure state. The same would happen if we let the AGI make me happy, or let it form a society with its AGIs, or whatever.
A strange thing about this situation is that it gives us a strong bias to endorse creating an AGI. If we refuse to imagine a scenario in which an AGI we have made does something we dislike, our picture of the world is going to remain incomplete. You would have to be a kind of superstitious or fearful person to rule out the possibility that your AGI could actually be good at AGI, or that some AGI could be better than you at being an AGI. If you're a good reasoner, you might have trouble imagining anything that could do something bad (or, as we have done here, anything that could make you unhappy). (Or, as some proponents of "singularitarian" AI risk believe, the very act of thinking about the possibility that an AGI could be bad is the exercise of that AGI, and comes closer to making an AGI that is that AGI.)
Are there ways out of this? I don't think so. The only way I can see out of this logical impasse is by adding more axioms. Here's my (counter-)proposal:
Even if it is theoretically possible for something to be really really good at something, it might not in fact be that good at it. Life is the natural example of this: it is possible to imagine things that are really really good at things, but our lives are not universally great, even though we have the opportunity to be really great. Rather, life is filled with so many possibilities that even if you chose one of them, it is possible to do better.
I think this axiom is a much better description of the way the world works than the axiom mentioned in the preceding bullet point. It's like the gulf between the mathematician's and the physicist's (or the philosopher's) description of the world -- a world in which things are always mapped to theories, or a world in which things are mapped to other things that may or may not be theories.
Now, some people (like Eliezer) might say something like "a world in which things are not mapped to theories is a world that's not very useful for us in the first place." But I think that's a flaw. You can't predict the future of a physical system by doing the math, you can't tell whether it's going to break or run hot or whatever, even though it is in principle possible for you to predict everything that will happen. That world doesn't do the things we need it to do. So you need to do something like the "axiom 2" above -- you need to predict things by making a model, but then you need to have another axiom to tell you that the model doesn't work, that it isn't sufficient to predict the state of the world. You need a world in which the model doesn't solve the model itself -- a world in which the world contains things that aren't theories. (Remember that these are the same things that we "real" people make: human stories about people doing things and making decisions.)
So anyway, with these axioms, I think the AI risk picture is really different. A powerful AGI is a powerful thing. It could do a lot of things. We might hope that our theory of mind is just like AGI theory of mind, but we have to have an axiom 3 that says "these two things are not the same." And even if that axiom were true, it wouldn't be probably true, because it's impossible for an AGI to be truly certain about the A
9 notes
austennerdita2533 · 3 years
Note
Hi to my friend and favorite Austen scholar :) I was wondering if you consider Darcy and Elizabeth an example of the opposites attract trope? I get that they APPEAR very different, with Lizzie exuberant and prone to laughing and teasing while Darcy is a fundamentally serious, reserved introvert. But as I reread and rewatch, I'm struck by how alike I think they actually are. They're both very critical, discerning and quick to judge, they're both extremely clever and intelligent, they're both very insightful (though, like all of us, only when it comes to people other than themselves and those they're closest to!), they're both close to very few but fiercely loyal and loving towards those they do let in, they're both very analytical and contemplative, they're both too stubborn and prideful and reluctant to change their ways until they both learn that changing is more a sign of strength than weakness. I could bore you by rambling further, but hopefully you get the point! I'm curious whether you agree that they're actually more fundamentally similar than different...? Or am I just as off base as usual?!
Hello lovely, and what a good question!
You know, I've always considered Darcy and Elizabeth to be opposites on the surface but then quite similar beneath it. (That probably reads like a bit of a cop out on my part, but I think it makes sense if you slice it apart.) I'd argue that combination is largely what makes them such a captivating pairing overall.
The so-called "opposite" attributes Darcy and Elizabeth possess work in a two-fold fashion. On the one hand, it explains why they butt heads in the beginning because, in personality, not to mention in the manner in which they're both able to interact with people - acquaintances and/or strangers, especially - they are diametrically opposed. She's extroverted, lively, good humored, and easy to engage in conversation. He's severe, socially inept, broody, and reserved to the point of being almost monosyllabic at the best of times. That difference fosters a lot of tension between them initially. They don't know what to make of each other. How to interact. It also helps to create, elongate, and preserve the canyon of misunderstanding that makes up their dynamic for the first half of the book.
On the other hand, the differences in their dispositions are also where hardcore attraction comes into play. They both gravitate toward - seek to investigate, rather - the qualities the other person has that they do not. It's like catnip. Draws them in like magnets. They're both similarly afflicted in that regard, I'd say.
Darcy is bewitched by Elizabeth's vivacity and openness, by her ability to laugh at the absurdity that is present throughout society. She's easy among new company where he is stilted, uncomfortable, and he LIKES being around her because of that. It draws him out of his shell. Forces him to be more present and attentive, even if "more present" manifests only in the heady looks he shoots at her from across the ballroom. He's stimulated by her wit, by her teasing. She astonishes him (in a good way), keeps him on his toes, and that's exciting...erotic.
Elizabeth, too, even in her most fastidious "I have never sought your good opinion" moments can't help but be caught up in the enigma that is Darcy. Arrogant and taciturn though he seems, his aloofness, as well as the natural reserve which surrounds him, makes him equal parts interesting and grating to her because she's unable to fully figure him out. She tries to glean what she can about him from observation, from limited time spent in his company, but he's essentially a lockbox (rude!), so instead she has to rely on the gossip that other people (Wickham) have related, and even then she's not satisfied. She still wants to know more. Needs to know more about who he is. She says herself that the different accounts she's heard of him "puzzle her exceedingly." The point here being that she can't stop trying to puzzle out the man from the second they meet no matter how hard she tries. She's caught up without realizing just how caught up she is. I mean, even when she's declaring she hates him I'd argue there's still a part of her that's more enthralled with him than anything. He remains an unsolved mystery, which is maddening as well as a little dazzling; so conscious or not, she feels a pull toward him. It cannot be helped. She's in over her head. I believe she's desperate to know if there's more to him than meets the eye (though she'd rather die than admit such a thing.) So really, his introversion has its own attractions for her as well--it keeps her probing, orbiting.
Like you mentioned, too, Darcy and Elizabeth have a lot of traits in common. They're clever, contemplative, critical, astute, and stubborn, to name a few. However, where the juxtaposition comes into the mix is how these things are expressed in their individual personalities. Because, in that regard, they do express or convey these traits differently. I think that's where the "opposites attract" trope could apply.
That said, Darcy and Elizabeth do both face similar conflicts throughout the novel. They're each prideful in their own way and must learn how to overcome their own snobbery, their own criticisms of people. I think we tend to overlook that as something they share because of how it manifests, again, with respect to their individual personalities and social classes. But without that tenet of similarity tying them together there'd be no romantic tension in the novel. That's the central conflict upon which the whole plot turns - their respective (and mutual!) pride and prejudice. Also, if they both weren't loyal and protective of those they loved most in the world, and if they hadn't had a mutual evolution where they'd learned not to be so quick to judge others and forgive past grievances, then the romantic resolution between them would have fallen flat. As readers, we wouldn't have been rooting so hard for them to be together if they hadn't had those things in common. Right?
Part of the reason we consider Darcy and Elizabeth to be a well-matched pair by the end of the story is because they're opposites on the surface who are bound by the same moral fiber - character - underneath. They're good people who have made and learned from their own mistakes. That growth is what matters. It's because of their mutual self-reflection and self-improvement that they're able to come together to form a healthy union.
In my estimation Darcy and Elizabeth are similar in all the big ways that matter - smart, devoted, forgiving, dependable, loving, etc. - and different in ways - cheerful, stoic, witty, quiet, rich, bougie, etc. - that allow their dynamic to feel fresh and surprising...not to mention swoon-worthy as hell!
4 notes
booksbroadwaybbc · 6 years
Text
I'm the biggest douchebag in the world and this is how I change via /r/selfimprovement
Hi, my name is Sanya and I am King of the Douchebags.
I'm 18 and I'm a massive douchebag, here's why.
1) I was moved from a music school to an academic school yet I still ride for my music school dawgs and troll everyone else. Loyalty ran deep.
2) I've been trolling everyone on different levels since I was a kid because stupid people get trolled and smart people feel an obligation to not get trolled by trolling 😂 I use an iphone lol
3) I have no clue what my academic interests are, I know I'm highly intelligent yet I don't apply it
4) Amazing girls wanted friendship and more with me but I was just a massive douche, and I was a douche with girls who didn't care
5) I thought I peaked ages ago and forgot about uni
6) I intend to do everything I said in my previous post
7) I trigger people
8) I compulsively use social media and my tech to trigger and annoy people on different social media platforms and project a persona, which I rationalise as "everyone on fricking insta does the same"
9) I'm obnoxiously confident and egotistical
10) I'm self-righteous
11) I'm weak, do no gym or sports, and I'm a broke boi
12) My ambitions, including my life plans, are flimsy and delusional
13) I demonize my fellow students, including their usage of social media, and convince myself that they're 'out to get me'
14) I have potential for relationships but I blow them up in the girls' faces
15) my ego was bigger than maths
16) my friendships have failed
17) i never practice the piano or keep good habits
18) my personal hygiene sucks
19) I literally make a girl wet as a waterfall and then make up reasons not to get laid, I prepare for top grades in my exams then purposefully fail. I'm such a douchebag
20) I troll my parents.
I'll give some context. My friends mostly despise me. I'm a loser on most levels, an academic loser. My parents are clinically depressed and have low-functioning lifestyles; they find communication and interpersonal relationships tricky, to say the least.
Reasoning/trigger points for my bad behaviour
1) using my tablet triggered by: compulsion to use twitter triggered by compulsion to write notes on Evernote and send them on private chats triggered by social anxiety
2) im scared of posting on Instagram triggered by social anxiety
3) i make up reasons for people hating me triggered by social anxiety
4) behaviour at wbgs triggered by 1/2/3 not learning triggered by hatred triggered by jealousy triggered by lack of authority triggered by ignorance triggered by not learning negative loop . not respecting expertise
5) not talking with girls triggered by social anxiety; not dating triggered by social anxiety; not having sex triggered by logistical problems, lack of trust and confidence
6) being a dick with girls triggered by lack of experience triggered by social anxiety. ditto with saying im sexually experienced
7) behaviour in form room triggered by social anxiety
8) current friendships via twitter triggered by mutually being awful people triggered by not growing triggered by social anxiety
9) behaviour at purcell triggered by desire to fit in triggered by social anxiety
10) behaviour at wbgs triggered by perceived need to be a badass space cowboy triggered by quotes of greatness triggered by seclusion triggered by social anxiety
11) exam failure triggered by mania triggered by social anxiety
12) fb accounts triggered by humour desire to fit in triggered by social anxiety
13) conversations lack of ability to communicate my current needs triggered by hatred of the cognitive functioning of everyone but me triggered by twitter triggered by social anxiety
14) latching onto trumps persona triggered by belief in my inner child triggered by faith triggered by experience triggered by social anxiety
15) dhillon dodhia conversations with me triggered by concern for me triggered by his perception of my health and my objective health triggered by his perception triggered by his perceived obligation to me
16) my blocks in musical performance triggered by perceived need to play the most difficult chords triggered by desire to 'win' triggered by social anxiety
17) Benedicts assessment of me triggered by carnage triggered by twitter triggered by social anxiety
18) content on twitter/Instagram triggered by inner child/culture reinforcement triggered by perceived greatness triggered by
19) twitter relationships triggered by anger/perceived superiority triggered by social anxiety
20) fathers communication triggered by perceived need triggered by social anxiety
21) mothers communication triggered by anxiety triggered by depression
22) ilusha episode triggered by sexual deprivation triggered by porn triggered by horniness
23) kits opinion on me triggered by knowledge triggered by desire to understand
24) ditto with all at wbgs/past
25) lucas action against me triggered by perceived need that im fucked . triggered by a hunch triggered by need to be relevant (desire to fit in) triggered by social anxiety
Submitted September 17, 2018 at 11:44PM by dopamineway via reddit https://ift.tt/2NSuaGm
0 notes