This was the first draft of Franky and Loid's visit to the BDSM club.
The version I have now is more in character. I don't think Loid would be this snobbish about sex, I don't think Franky could trick Loid into going anywhere, Europe doesn't have US tipping culture, etc.
But! I had a lot of fun writing this so I wanted to share anyway
Franky and Loid play off of each other really well and are really fun to write. In the future I’d like to write them platonically having sex bc idk I think it would be funny 🤷
Enjoy!
Loid regrets every scrap of trust he's ever given Franky as he opens the doors of what can only be a sex club. Loid had guessed the address would take him to the seedier part of town, but Franky had laughed it off with a "Hey, trust me, a change of pace will be nice!"
His casual tone had Loid hoping that perhaps it was just a hole-in-the-wall bar with some unpleasant neighbors, but he now recognizes that thought was foolishly optimistic. Apparently, so was his assumption that it was only an ordinary sex club: he gives the bouncer the passphrase and Franky's name and is led to a private table overlooking the center of the club.
This place must be a fetish sex club if the unusual devices, elaborate knots, and shiny leather are anything to go by. Loid gives the space a passing glance to ensure there are no threats and focuses intently on not making eye contact with the occasional curious glance his stiff posture gets.
He arrives at the table Franky had reserved and gives the waitress a polite nod of thanks as he goes to sit down. She stays and looks at him expectantly until he realizes she is expecting a tip, and he quickly pulls out his wallet and gives her what is apparently the appropriate amount of dalcs.
“Oh good, you made it.” Franky says as Loid sits down primly, crossing his legs and shutting his eyes so he won’t glare at Franky as the other man attempts (poorly) to hide his amusement at Loid’s flustered interaction with the waitress.
"Franky, why the fuck are we here?" Loid asks, deciding that hiding his glare is a politeness Franky does not currently deserve.
“What do you mean?” Franky says with an affected air of casualness. “I thought a change of pace would be nice! Don’t you get tired of meeting at the same three places?”
Loid crosses his arms and stares evenly at Franky. The silent treatment is a sure-fire way to wear him down. It’s usually better to let Franky explain himself instead of succumbing to his goading.
“Fine, fine” Franky starts, not letting Loid’s silent treatment stretch out long. “I honestly wasn’t even sure if you would show up.”
Loid's face shifts with affront at the insult to his professionalism. He moves to smooth it out, but Franky catches the expression before it's concealed.
"No offense!" Franky continues. "I know you would do anything for your mission, which is why I'm serious about the change of pace!"
Franky looks at him expectantly as he gestures to the activities below them. Loid is lost so he nods for Franky to keep going.
"Look," Franky says, "I'm probably the closest thing you have to a friend and I'm honestly a little worried about you."
The concern would almost be touching if Loid weren’t so confused and they weren’t in a fucking sex club.
"WISE is working you to the bone, and you've been managing so far, but it's only a matter of time before something gives. When it does, someone is going to get seriously hurt, and you'll be lucky if it isn't you."
Loid tips his head to concede the point. Since Operation Strix started there has scarcely been a moment where he isn't Twilight the superspy or the head of the Forger household. Even his sparse free time is spent understanding Anya's TV shows, reading up on parenting, and learning new recipes. He can't recall the last time he wasn't actively doing his job or holding up a front.
“So why are we here?” Loid asks, Franky’s explanation not connecting with their location.
“You need to find a way to let go, if you don’t you’re going to explode.” Franky pantomimes Loid’s explosion with his hands and little booming noises. “This is just one option of many.”
Franky points to a leather-clad woman spanking the man spread across her lap with a flogger. Loid quickly looks away as the man lets out what he can assume is a breathy moan but it is lost in the noise of the crowd and the music.
“Is that the only reason you brought me here? No new intel or secret sources?” Loid asks, not bothering to hide his frustration. “No secret plots that would absolutely require me being here?”
“Well no, but I figured this-” Franky gestures around them and Loid knows better than to look this time. “-would kickstart your imagination, if only to help you think of something else that would help.”
Franky coughs, and Loid knows him well enough to recognize he's hiding a snicker. Loid gives him a pointed look.
“I also thought it would be funny.” Franky admits before continuing. “I’m serious though! You need to find an outlet to relax!”
“Is that all?” Loid asks, straightening his posture. Franky looks disappointed but nods.
“Thank you for your concern, but I can take care of myself.” Loid stands up and buttons his suit jacket. “Good night Franky, let’s not do this again.”
Loid does not storm out, because he is an adult with self-control, but he does reject another waitress's offer of assistance with more harshness than necessary. Loid will be damned if he spends another fucking cent in this place, and he's not getting tricked again, by Franky or anyone else.
Deploy ML Models On AWS Lambda - Code Along
Introduction
The site Analytics Vidhya put out an article titled Deploy Machine Learning Models on AWS Lambda on Medium on August 23, 2019.
I thought this was a great read, and thought I might do a code along and share some of my thoughts and takeaways!
[ Photo by Patrick Brinksma on Unsplash ]
You should be able to follow along with my article and theirs at the same time (I would read both), and mine should help you through theirs. Their section numbers don't make sense (they use 2.3 twice), so I chose to use my own section numbers (hopefully you don't get too confused there).
Let’s get going!
Preface - Things I Learned The Hard Way
You need to use Linux!
This is a big one. I went through all the instructions on Windows thinking it would all work the same, but numpy on Windows is not the same as numpy on Linux, and AWS Lambda runs Linux.
After Windows didn't work, I switched to an Ubuntu VirtualBox VM, only to watch my model call fail again! (Note that the issue was actually with how I was submitting my data via cURL, not Ubuntu, but more on that to come.) At the time I was able to deploy a simple function using my Ubuntu VM, but it seemed that numpy was still doing something weird. I thought maybe the issue was that I needed to use Amazon Linux (which is RedHat-based and actually different from regular Ubuntu).
To use Amazon Linux, you have a few options (from my research):
Deploy an EC2 instance and do everything from the EC2 instance
Get an AWS Linux docker image
Use Amazon Linux WorkSpace
I went with option #3. You could try the other options and let me know how it goes. Here is an article on starting an Amazon Linux WorkSpace:
https://aws.amazon.com/blogs/aws/new-amazon-linux-workspaces/
It takes a while for the workspace to become available, but if you’re willing to wait, I think it’s a nice way to get started. It’s also a nice option if you’re running Windows or Mac and you don’t have a LOT of local memory (more on that in a minute).
The funny thing is that, coming full circle, I don't think I needed Amazon Linux after all. I think the problem was with how I was submitting my cURL data. So in the end you could probably use any flavor of Linux, but when I finally got it working I was on Amazon WorkSpace, so everything here is centered around Amazon Linux WorkSpace (the steps should be roughly the same for Ubuntu; just replace yum with apt or apt-get, and the few other changes should be clear).
Another nice thing about running all of this from Amazon WorkSpace is that it also highlights another service from AWS. And again, it’s also a nice option if you’re running Windows or Mac and you don’t have a LOT of local memory to run a VM. VMs basically live in RAM, so if your machine doesn’t have a nice amount of RAM, your VM experience is going to be ssslllooowwwww. My laptop has ~24GB of RAM, so my VM experience is very smooth.
Watch out for typos!
I reiterate this over and over throughout the article, and I tried to highlight some of the major ones that could leave you pulling your hair out.
Use CloudWatch for debugging!
If you’re at the finish line and everything is working but the final model call, you can view the logs of your model calls under CloudWatch. You can add print statements to your main.py script and this should help you find bugs. I had to do this A LOT and test almost every part of my code to find the final issue (which was a doozy).
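As a concrete sketch of what I mean (the handler shape here is hypothetical, not the article's final code), print statements placed like this show up in the function's CloudWatch log stream:

```python
import json

def lambda_handler(event, context):
    # Anything printed here lands in the function's CloudWatch log stream,
    # which is the easiest way to see exactly what Lambda received.
    print("raw event:", json.dumps(event))
    body = event.get("body")
    print("body repr:", repr(body))  # repr() exposes stray quotes/whitespace
    return {"statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": str(body)}
```

Open the log group for your function in the CloudWatch console after each test call to read these lines.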
Okay, let’s go!
#1 - Build a Model with the SHAP and LightGBM Libraries
The article starts off by building and saving a model. To do this, they grab an open dataset from the package SHAP and use LightGBM to build their model.
Funny enough, SHAP (SHapley Additive exPlanations) is a unified approach to explain the output of any machine learning model. It happens to also have datasets that you can play with. SHAP has a lot of dependencies, so it might be easier to just grab any play dataset rather than installing SHAP if you’ve never installed SHAP before. On the other hand, it’s a great package to be aware of, so in that sense it’s also worth installing (although it’s kind of overkill for our needs here).
Here is the full code to build and save a model to disk using their sample code:

import shap
import lightgbm as lgbm

# Grab the adult census dataset that ships with SHAP
data, labels = shap.datasets.adult()

params = {'objective': 'binary',
          'boosting_type': 'gbdt',  # their code has 'booster_type', which LightGBM ignores
          'max_depth': 6,
          'learning_rate': 0.05,
          'metric': 'auc'}

dtrain = lgbm.Dataset(data, labels)
model = lgbm.train(params, dtrain, num_boost_round=100, valid_sets=[dtrain])
model.save_model('saved_adult_model.txt')
Technically you can do this part on any OS; either way, I suggest creating a Python venv.
Note that I’m using AWS Linux, which is RedHat based so I’ll be using yum instead of apt or apt-get. First I need python3, so I’ll run:
sudo yum -y install python3
Then I created a new folder, and within that new folder I created a Python venv.
python3 -m venv Lambda_PyVenv
To activate your Python venv just type the following:
source Lambda_PyVenv/bin/activate
Now we need shap and lightgbm. I had some trouble installing shap, so I had to run the following first:
sudo yum install python-devel
sudo yum install libevent-devel
sudo easy_install gevent
Which I got from:
https://stackoverflow.com/questions/11094718/error-command-gcc-failed-with-exit-status-1-while-installing-eventlet
Next I ran:
pip3 install shap
pip3 install lightgbm
And finally I could run:
python3 create_model.py
Note: For creating and editing files I use Atom (if you don't know how to install Atom on Ubuntu, I would open a new terminal, check out the documentation, and just submit the few lines of code from the command line).
#2 - Install the Serverless Framework
Next thing you need to do is open a new terminal and install the Serverless Framework on your Linux OS machine.
Something else to keep in mind, if you try to test out npm and serverless from Atom or Sublime consoles, it’s possible you might hit errors. You might want to use your normal terminal from here on out!
Here is a nice article on how to install npm using AWS Linux:
https://tecadmin.net/install-latest-nodejs-and-npm-on-centos/
In short, you need to run:
sudo yum install -y gcc-c++ make
curl -sL https://rpm.nodesource.com/setup_10.x | sudo -E bash -
sudo yum install nodejs
Then to install serverless I just needed to run
sudo npm install -g serverless
Don’t forget to use sudo!
And then I had serverless running on my Linux VM!
#3 - Creating an S3 Bucket and Loading Your Model
The next step is to load your model to S3. The article does this via the command line. Another way is to use the AWS Web Console. If you read the first quarter of my article Deep Learning Hyperparameter Optimization using Hyperopt on Amazon Sagemaker, I go through a simple way of creating an S3 bucket and loading data (or, in this case, a model).
#4 - Create the main.py File
Their final main.py file didn't totally work for me; it was missing some important stuff. In the end my final main.py file looked something like:

import boto3
import lightgbm
import numpy as np

def get_model():
    bucket = boto3.resource('s3').Bucket('lambda-example-mjy1')
    bucket.download_file('saved_adult_model.txt', '/tmp/test_model.txt')
    model = lightgbm.Booster(model_file='/tmp/test_model.txt')
    return model

def predict(event):
    samplestr = event['body']
    print("samplestr: ", samplestr)
    print("samplestr str:", str(samplestr))
    sampnp = np.fromstring(str(samplestr), dtype=int, sep=',').reshape(1, 12)
    print("sampnp: ", sampnp)
    model = get_model()
    result = model.predict(sampnp)
    return result

def lambda_handler(event, context):
    result1 = predict(event)
    result = str(result1[0])
    return {'statusCode': 200,
            'headers': {'Content-Type': 'application/json'},
            'body': result}
I had to include the 'headers' section and the body, and these needed to be returned as strings in order to not get 'internal server error' type errors.
You'll also notice that I'm taking my input and converting it from a string to a 2D numpy array. I was feeding a 2D array like the instructions show, and this was causing an error. This was the error that caused me to leave Ubuntu for Amazon Linux, because I thought it was still related to the OS when it wasn't. Passing something other than a string will do weird things; it's better to pass a string and then convert the string to whatever format is required.
(I can’t stress this enough, this caused me 1.5 days of pain).
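To make that concrete, here is roughly what the conversion in main.py does to the comma-separated string that cURL delivers (the sample values come from the cURL call at the end of this post):

```python
import numpy as np

# The raw body that `curl -d "39,7,..."` hands to the handler
samplestr = "39,7,13,4,1,0,4,1,2174,0,4,33"

# main.py parses the string into the 1x12 array the model expects
sampnp = np.fromstring(samplestr, dtype=int, sep=',').reshape(1, 12)
print(sampnp.shape)  # (1, 12)
```

Note that anything other than a flat comma-separated string (say, a nested list rendered through str()) will make this parse produce garbage or fail.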
#5 - YML File Changes
Next we need to create our .yml file which is required for the serverless framework, and gives necessary instructions for the framework.
For the most part I was able to use their YML file, but below are a few properties I needed to change:
runtime: python3.7
region: us-east-1
deploymentBucket:
  name: lambda-example-mjy1
iamRoleStatements:
  - Effect: Allow
    Action:
      - s3:GetObject
    Resource:
      - "arn:aws:s3:::lambda-example-mjy1/*"
WARNING!!!
s3.GetObject should be s3:GetObject
Don’t get that wrong and spend hours on a blog typo like I did!
If you need to find your region, you can use the following link as a resource:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Concepts.RegionsAndAvailabilityZones.html
Also, make sure the region you specify is the same region that your S3 bucket is located in.
#6 - Test What We Have Built Locally
Before you test locally, I think you need your requirements.txt file. For now you could just run:
pip freeze > requirements.txt
Now, if you’re like me and tried to run it locally at this point, you probably hit an error because they never walked you through how to add the serverless-python-requirements plugin!
I found a nice article that walks you through this step:
https://serverless.com/blog/serverless-python-packaging/
You want to follow the section that states ...
“ Our last step before deploying is to add the serverless-python-requirements plugin. Create a package.json file for saving your node dependencies. Accept the defaults, then install the plugin: “
In short, you want to:
Run 'npm init' from the command line in your project directory
I gave my package the same name as my service: 'test-deploy'
The rest I just left blank (i.e., just hit enter)
At the last prompt, enter 'y'
Finally you can enter ‘sudo npm install --save serverless-python-requirements’ and everything so far should be a success.
Don’t forget to use sudo!
BUT WE’RE NOT DONE!!! (almost there...)
Now we need to do the following (if you haven’t already):
pip3 install boto3
pip3 install awscli
run “aws configure” and enter our Access Key ID and Secret Access Key
Okay, now if we have our main.py file ready to go, we can test locally (again) using the following command:
serverless invoke local -f lgbm-lambda --path data.txt
Where our data.txt file contains
{"body":[[3.900e+01, 7.000e+00, 1.300e+01, 4.000e+00, 1.000e+00,
0.000e+00,4.000e+00, 1.000e+00, 2.174e+03, 0.000e+00, 4.000e+01,
3.900e+01]]}
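If you'd rather generate data.txt from code, here's a quick hypothetical sketch that writes the body as a comma-separated string, the format main.py's np.fromstring call parses (the array form shown above is what I used for the local invoke):

```python
import json

# Sample features for the adult-income model (same values as the cURL test)
sample = [39, 7, 13, 4, 1, 0, 4, 1, 2174, 0, 4, 33]

# Write an event file whose body is a comma-separated string, which is
# what main.py converts into a 1x12 numpy array.
event = {"body": ",".join(str(v) for v in sample)}
with open("data.txt", "w") as f:
    json.dump(event, f)
```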
Also, beware of typos!!!!
Up to this point, make sure to look for any typos, like your model having the wrong name in main.py. If you’re using their main.py file and saved your model with the name “saved_adult_model.txt”, then the model name in main.py should be “saved_adult_model.txt”. Watch out for plenty of typos, especially if you’re following along with their code! They didn’t necessarily clean everything :(. If you get stuck, there is a great video at the end of this blog that could help with typo checking.
#7 - The Moment We’ve All Been Waiting For, AWS Lambda Deployment!
Now we need to review our requirements.txt file and remove anything that doesn’t need to be in there. Why? Well, this is very important because AWS Lambda has a 262MB storage limit! That’s pretty small!
If you don’t slim down your requirements.txt you’ll probably hit storage limit errors.
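For reference, a slimmed-down requirements.txt for this project might contain little more than the following (boto3 can usually be dropped because the Lambda Python runtime already provides it; exact version pins are up to you):

```
lightgbm
numpy
scipy
```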
With that in mind, I recommend taking only the files and folders needed and moving them to some final folder location where you'll run "serverless deploy". You should only need the following:
main.py
package.json
requirements.txt
serverless.yaml
node_modules
package-lock.json
Now from the command line within our final folder, we just run:
serverless deploy
If all goes well, this should deploy everything! This might take a minute, in which case get some coffee and enjoy watching AWS work for you!
And if everything went as planned, then we should be able to test our live deployment using cURL!
curl -X POST https://xxxxxxxxxx.execute-api.xx-xxxx-1.amazonaws.com/dev/predictadult -d "39,7,13,4,1,0,4,1,2174,0,4,33"
Hopefully in the end you receive a model response like I did!
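If you prefer Python to cURL, the same request can be sketched with the standard library (the placeholder URL below should be replaced with the endpoint that serverless deploy printed for you):

```python
import urllib.request

# Placeholder endpoint: substitute the URL that `serverless deploy` printed
url = "https://xxxxxxxxxx.execute-api.xx-xxxx-1.amazonaws.com/dev/predictadult"
payload = "39,7,13,4,1,0,4,1,2174,0,4,33".encode()

req = urllib.request.Request(url, data=payload, method="POST")
# With a real endpoint, uncomment to send the request and print the prediction:
# print(urllib.request.urlopen(req).read().decode())
```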
Resources:
https://medium.com/analytics-vidhya/deploy-machine-learning-models-on-aws-lambda-5969b11616bf
https://serverless.com/framework/docs/providers/aws/guide/installation/
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Concepts.RegionsAndAvailabilityZones.html
https://serverless.com/blog/serverless-python-packaging/
And here is a great live demo from PyData Berlin!