# aoreugif.net

the longest journey – the blog of J.B. Figueroa

## Live Updates: Post Numbered as #1 3/12/17

Hello world

Having a post ready by Sunday evenings is something I want to make a habit of. Habits are made by routinely doing something, so I should begin routinely doing it to get started. Nothing that I write is ever publication-ready, but when does that ever really matter? It’s like training for a marathon. If you don’t even start training, how can you ever finish? The hardest part is sometimes just putting on your shoes in the early dawn of morning. The analogy here is putting pen to paper, or finger to keyboard. It’s Sunday night. I just drank a cup of coffee, and I have nothing planned for tomorrow. Here are some things I want to get done, and I guess I can try “live blogging” while doing it. A quick internet search reveals web pages comparing this to live television. Wikipedia reads:

“A live blog is a single post which is continuously updated with timestamped micro-updates which are placed above previous micro-updates.”

So, it’s “timestamped micro-updates,” and tonight, I’m going to water my plants. Just kidding, but tomorrow I have to make a judgment call on the stuff germinating. I don’t know what unmoist soil looks like exactly, but that’s a topic for another day. Tonight, my to-do list.

1. I have two drafted blog posts I want to actually, properly post. One’s the previous one before this, on worms and vermicomposting, which is still titled with “Draft” in its name. The other is a somewhat-long post on the vim editor that’s been sitting in my draft folder for.. over 12 months. If there’s a 3rd it would be the one on trajectory path planning, but for now, let’s just edit and make the worm blog post reader-ready. I have a couple of videos posted on my Vimeo account, but the free version limits me to 500 MB of uploads per week. Enough for a 4-minute video. There are some bandwidth-related adjustments I need to make. If you’re reading this, understand that bandwidth is money. These textual words, and the pixels that make up images, all cost bandwidth; and yes, bandwidth is money. So, I’m using a 3rd-party service to host videos, and I’ve been deciding on how I should do images. The current hosting provider changes the tier pricing for this website sometime around early or mid-November. At my pricing plan, I’m paying for about 10 GB of upload/download regardless of whether I use it or not. Storage itself is a variable cost, so I want smaller files if I plan to host them on here. CPU/RAM costs associated with the back-end stuff are free for now. Woot? I should double-check that.

Throughout the last month or so, I took some videos and photos. I need to reduce the file size on the photos, and I need to figure out how to use a video editor in order to upload stuff to Vimeo. I could use YouTube and not worry about the limits on Vimeo, but I’m going through this “Anti-Google” stage right now. Notice how above it reads, “…quick internet search reveals web pages comparing…” and not, “I googled for XYZ”? I’m trying to get rid of it from my vocabulary. It’s a freedom and privacy thing, but freedom isn’t free and neither is bandwidth. So, how to upload? What to upload?

2. I want to make ISK. What is ISK? It’s “Interstellar Kredits,” the currency of an online game from CCP. It’s also the abbreviation for the currency of Iceland, which is where CCP’s HQ is located. I make the majority of my ISK through market trading. This means a lot of spreadsheets and, more importantly, programming. The more independent you are on information, the more money you make, but it requires some time invested in some platform that you create. In tonight’s scenario, I want to automate my API calls to eveMarketer’s API. They use “swagger.” The new* “ESI” (is it called ESI?) also uses swagger. What is swagger? It’s this: https://swagger.io/specification At some point, I’d like to tap into CCP’s actual ESI and get a permission token to grab data without being dependent on eveMarketer. I learned my lesson with eve-central. They’re gone, and partially it’s because CCP started using swagger. Other websites quickly took over their market share. Right now though, I want to use something I’m familiar with, and eveMarketer’s API is almost identical to eve-central’s API. Here are their URLs:

https://evemarketer.com/
http://eve-central.com/

And yes, I could just use Google Docs’ “IMPORTXML” function, but everyone and their mom does that. I’ve been using it for the past couple of years. Everyone’s spreadsheet has a backend that looks the same. I want something different. Also, I’m rebelling against Google, right? And also-also, online spreadsheets are fun and all, but those function calls won’t work if you’re offline. I want a way to save API data locally and automatically, so I can start building some historical data to compare against.
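A hedged sketch of that “save it locally, automatically” idea. The endpoint path and the IDs below (typeid 34 for Tritanium, system 30000142 for Jita) are my assumptions, not something confirmed from eveMarketer’s docs, so verify them against their swagger spec:

```shell
# Build a timestamped output name for one marketstat poll.
# Endpoint path and IDs are assumptions; check eveMarketer's swagger spec.
typeid=34                        # Tritanium (assumed)
system=30000142                  # Jita (assumed)
url="https://api.evemarketer.com/ec/marketstat?typeid=${typeid}&usesystem=${system}"
stamp=$(date +%Y%m%d-%H%M%S)
out="marketstat-${typeid}-${stamp}.xml"
# The actual fetch, commented out so the sketch has no network dependency:
# curl -s "$url" -o "$out"
echo "would save $url as $out"
```

Run from cron (say, hourly) and the timestamped XML files pile up into exactly the offline historical dataset that IMPORTXML can’t give you.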

====

Alright, so on topic 1. Last night, I installed a program for video editing. Humm, before I start posting screenshots I should probably go through the part on uploading images. There’s a program I had in mind for that, one that you can apparently control via the command line. Let me find it..

====

Before I even start that, I need to do a quick WordPress update. WP-CLI to the rescue! Good to know crayon-te-content is still working.

With that, we should have a nice and healthy WordPress.
I think that program was called ImageMagick: https://www.imagemagick.org/script/index.php

It advertised itself as something you can control via scripts, and the reason why I’m leaning towards it is two-fold. Another blogger (am I one?) recommended it, and it is apparently already installed on Debian. Simply type “display”. From a long-ago conversation with @georgiecel:

> […] Sure the images are high resolution. But when it comes to my blog, I resize to a maximum of 1600 pixels on the long edge, but serve slightly smaller versions depending on display resolution. After I resize them (using Photoshop – though there are some tools out there that use the command line and do it as an automatic process, I think one was called imagemagick?) I save them at 60% quality JPEG. You’ll be surprised how invisible the loss in quality is. 60% cuts filesize down a lot without sacrificing too much quality. Then I run the images through the app ImageOptim just to cut them down about 1% more. If you go to my much older posts (like 2013 or something?) you will notice that the images are not as optimised and take up so many megabytes! […] But yeah, no need to serve images much bigger than 1000-2000 pixels in width unless you really want to care about 4K monitors. The images on Stocksnap are given as a more raw 100% quality large size image, so that you have flexibility to play around with the image as much as you need.

So ImageOptim is apparently some Mac app, and it’s offered on the web as a service: https://imageoptim.com/mac. Its promotional material advertises itself as “.. saving on bandwidth.” They’re talking my language now. I mean, I would save on bandwidth by not posting photos, but then this website wouldn’t be as pretty, would it? There’s also a GitHub repository: https://github.com/ImageOptim/ImageOptim Humm.. for a Mac. I don’t use a Mac. I’ll spend some time reading through this and ImageMagick’s documentation and then decide on a strategy for script-making. That, or I’ll just look up some tutorial somewhere. The goal is simple: reduce file size, keep faithful to the original image.
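To make the strategy concrete, here’s georgiecel’s recipe (1600px long edge, 60% JPEG) as an ImageMagick sketch. This assumes the v6 “convert” binary (v7 renames it to “magick”), and uses a generated gray canvas as a stand-in photo:

```shell
# Resize so the long edge is at most 1600px, then save at 60% JPEG quality.
# '1600x1600>' means "only shrink images already bigger than that".
if command -v convert >/dev/null 2>&1; then
  convert -size 2000x1200 xc:gray photo-full.jpg      # stand-in "photo"
  convert photo-full.jpg -resize '1600x1600>' -quality 60 photo-web.jpg
  identify -format '%wx%h\n' photo-web.jpg            # 1600x960 for this canvas
else
  echo "ImageMagick not installed; skipping"
fi
```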

===

What did they call it, lossless compression? The “El Dorado” of all JPEG algorithms? Nay, JPEG uses lossy compression, which is different from lossless compression.

===

And now I’m looking at https://trimage.org/, “inspired” by ImageOptim. Wootness to lossless compression!

easy AF

If I were to make a real-time vision application, I wouldn’t care about cost. I’d care about speed. Humm.. trade-offs..

===

Cycling through the images I’m looking to upload, I realize that I don’t need to worry about blurry, pixelated photos. My lame photography skills already take care of that. It’s hard to pick out one that isn’t blurry. Capturing photos is an ability I need to put more skill points into.

Another way to save on bandwidth is by not creating so many revisions that your website has to keep in storage @_@

Some picture fun. Here’s a slightly compressed image of a no-chill label.

It’s a large photo actually. The HTML for this text is actually resizing it with width=”300″ height=”169″. You, the reader, are downloading the full image, but your browser reads those parameters and adjusts it locally on your screen. This probably doesn’t have to happen. In fact, it doesn’t. This proves my point. Your browser is adjusting it to a lower resolution anyway, it’s too large a photo to fit on screen (let alone to embed with text), and it takes up 2.5 MB even while it’s compressed. The original photo was a1 = 2,574,171 bytes and it was compressed down to b1 = 2,530,408, so 1 – c1 = 0.017, or 1.7%. Here’s a screenshot of that, a compressed screenshot.

So, the program isn’t lying to you when it gives you the compression ratio. It just reads the file size before and after. I also compressed this uploaded screenshot. With a2 = 128.6 kB and b2 = 95.4 kB, 1 – c2 = 0.25816 –> 25.8%. That’s a twenty-five percent decrease. If someone offers you 25%, you say, “Yes please”. Now, the issue with the no-chill label photo is the resolution. This step is just supposed to be a final “trim” before actually uploading. I haven’t really developed a ‘style’ for how embedded images should go on this site, but seeing as the default resolution values here are width=”300″ height=”169″, that seems like a good enough start for now.
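The arithmetic above, scripted so it works for any before/after pair (same formula: savings = 1 - after/before):

```shell
# savings = 1 - after/before, to three decimal places
ratio() { awk -v a="$1" -v b="$2" 'BEGIN { printf "%.3f\n", 1 - b / a }'; }
ratio 2574171 2530408   # the no-chill photo: prints 0.017
ratio 128.6 95.4        # the screenshot:     prints 0.258
```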

=====

There are a number of paths I could take right now but… https://www.imagemagick.org/Magick++/tutorial/Magick++_tutorial.pdf Scaling, rotating images, Bezier curves.. the wrapper library is strong with this one. Although, they flat out tell you (page 3) that some of the library functions don’t work. The better side of me tells me to avoid this library, but I already started reading.

The word ‘daunting’ was used. CLI processing is simple though

That should be enough, and maybe

is enough to nuke an entire directory of photos. With the Magick++ library the function call would be

If the zoom_factor value is < 1 it zooms out, and if zoom_factor is > 1 it zooms in; the image changes in resolution, but the aspect ratio pixelX:pixelY remains the same. Humm.. just for fun, let’s just go with the .cpp. Okay, here’s the plan. Install the library, and then make a Makefile to.. Or, just add the library to the path, and then I don’t need to link it, er… Installing is technically the same as adding to the PATH. The Makefile will do the same job as linking it. er…
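Back on the CLI side, my hedged guesses at the two elided commands above: convert writes a separate output file, while mogrify applies the same operation in place across a whole directory, which is exactly how one line can nuke every photo in it. Filenames here are placeholders.

```shell
# convert: reads input, writes a new file; the original stays untouched
if command -v convert >/dev/null 2>&1; then
  convert -size 200x100 xc:gray pic.jpg       # placeholder input
  convert pic.jpg -resize 50% pic-half.jpg    # writes a 100x50 copy
fi
# mogrify: same resize, but it overwrites the originals (the directory nuker);
# left commented out on purpose:
# mogrify -resize 50% *.jpg
```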

So, yes, ImageMagick is installed on this OS, but the Debian FTP version isn’t even up to date yet.
via https://packages.qa.debian.org/i/imagemagick.html CVS

but is it really fixed? Until 8:6.9.9.6 gets updated, no it’s not. They just disabled webp on that version and moved on. I admit, I’m not even using this feature, but if I take anything from this it’s that Debian won’t have upstream versions of most programs, and even now they have plenty of security issues to be concerned about. I guess I should be glad they still have people looking after it; at the same time, “free software is worth every penny you pay for it.” For now, Debian offers version 6; the makers of ImageMagick are on version 7.

https://www.imagemagick.org/script/porting.php

Apparently now everything is ‘channel aware’. With version 7, commands like “convert” were replaced with simply “magick.” See above. A quick manual resizing trial quickly reduces that 2.6 MB photo down to under 25 kB. That’s significant. Now to automate it.

I want to create a C++ program that will take every *.jpg in a directory and create new resized versions of it, x by y pixels. The program will ask for a lengthX in pixels and an aspectRatio. The lengthY will be computed from those.
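Until that C++ program exists, the same plan fits in a shell sketch: ask for lengthX and an aspect ratio, derive lengthY, and resize every *.jpg into a new file. The 16:9 / 300px numbers just echo the embedded-image defaults mentioned earlier.

```shell
lengthX=300
aspectX=16; aspectY=9
lengthY=$(( lengthX * aspectY / aspectX ))    # 300*9/16 = 168 (integer math)
for f in *.jpg; do
  [ -e "$f" ] || continue                     # glob matched nothing
  command -v convert >/dev/null 2>&1 || break # no ImageMagick, give up
  convert "$f" -resize "${lengthX}x${lengthY}!" "resized-$f"  # '!' forces exact size
done
echo "target geometry: ${lengthX}x${lengthY}"
```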

Translation: old…

So… the authoritative repo is on GitHub. If I recall correctly,
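the elided command was presumably the usual clone of the official mirror. A hedged reconstruction, guarded behind a variable so it doesn’t actually hit the network unless asked:

```shell
repo="https://github.com/ImageMagick/ImageMagick.git"
if [ "${DO_CLONE:-no}" = "yes" ]; then
  git clone --depth 1 "$repo"     # --depth 1 spares some bandwidth
else
  echo "set DO_CLONE=yes to run: git clone --depth 1 $repo"
fi
```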

but remember to cd into the directory you want to copy this into. And that takes a minute or two to finish. I’m in my programs folder, and I noticed I have Krita installed on there. I’ve been “meaning” to use it at some point. That was about 6 months ago? Here’s the URL if anyone’s interested in that. It’s actually pretty advanced. https://krita.org/en/

ImageMagick might take more than just a minute actually. 97.57 MiB; must be a slow connection.

ImageMagick/Install-unix.txt should have everything you need to install

We can skip the “tar xvfz ImageMagick.tar.gz” step since we basically just git cloned it.

vim configure then G tells us it has a little over 41 thousand lines of code, meh. Just pretend there’s no virus in there or w/e.

Back to Install-unix.txt, oh god. This isn’t fully automated. I actually have to read through this.
Okay, so inside the repo

then skip to line 497 and that should be the rest. Maybe that was just a scare. If there’s anything to check for, it’s the line

It should be on the output table by default. This is the feature we want to use for the c++ class wrappers. The installation seems typical now. Next is just
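The elided steps are presumably the standard autotools sequence, run inside the cloned directory. --with-magick-plus-plus (the C++ wrappers we care about) defaults to yes, but spelling it out makes the intent visible. Guarded so pasting this doesn’t kick off a build by accident:

```shell
if [ "${DO_BUILD:-no}" = "yes" ]; then
  ./configure --with-magick-plus-plus=yes
  make
  sudo make install
else
  echo "set DO_BUILD=yes inside the ImageMagick source tree to build and install"
fi
```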

As I microwave my dinner, I verify the integrity of this install. At this point the old identify -version command above spits out a cannot-access error.

make check

tells me ImageMagick 7.0.7 is installed and nothing fails: 86/86 demos passed.

but

display fails

identify -list configure

fails also

the system isn’t able to reach the shared library. the exact output is

some help:  https://gist.github.com/wacko/39ab8c47cbcc0c69ecfb

Again, I’m installing on a Debian system, and it seems to be that the ImageMagick folk have things ready for Solaris/Unix machines. So yeah, let’s just do that.

and wow, it works. I don’t know why it works, but it works. <— most said line ever.
What is ldconfig? Well.. man ldconfig reads,
ldconfig – configure dynamic linker run-time bindings
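In short, the fix from the gist: a source build installs under /usr/local by default, and the runtime linker doesn’t search /usr/local/lib until its cache is refreshed. A minimal sketch, assuming that default prefix:

```shell
libdir=/usr/local/lib                      # default prefix for a source build
if [ "$(id -u)" -eq 0 ]; then
  ldconfig "$libdir"                       # rescan and rebuild the linker cache
else
  echo "run: sudo ldconfig $libdir"
fi
```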
Alright, I’m going to eat dinner, and then I’ll do a test run on the c++ wrappers.

oh, and btw.

spits out

so, woot woot!

With it now being tomorrow, I’ll call this an end to this “live” blog.

====
If I read the word, “automagically” one more time I’m going to puke.
So, I’m going to have to make a Makefile for this.

Inside /Magick++/bin there’s a “Magick++-config” script. I’m a little confused as to what it does exactly. Is the usage output supposed to change based on the machine you’re using? Is it just a little reminder script made to bring out system variables to tell you how you should link the header files? Humm.. why are the flags in backquotes? Backquotes inside “double quotations” are evaluated and expanded. Ahh..

so ./Magick++-config is a shell script. The usage variable is a string variable. The script goes through an if, fi test. If $# is 0, it echoes the string variable “usage” and exits. The $# variable is peculiar though. The first few lines of the script are below. Okay so, this would’ve been obvious to someone more familiar with shell scripts. Above, in “if test $# -eq 0; then”, the test is actually a program called test.

yeah..
Today my compiler of choice is g++. By putting the backquoted string portions into a command terminal, we get something that’s expected: path variables. Referring back to the Magick++_tutorial.pdf linked above, on page 3, under the subtitle “Using the Magick++ library in an application”: the purpose of this script file is to help you get the LDFLAGS for your system.
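The backquote mechanism in isolation, plus the compile line it enables (demo.cpp is a placeholder source file; the four options are the ones Magick++-config’s usage text lists):

```shell
# backquotes inside double quotes get command-substituted before the
# surrounding command ever runs:
flags="`echo -I/usr/local/include`"
echo "$flags"                    # prints -I/usr/local/include

# which is why the canonical Magick++ compile line looks like this
# (guarded; demo.cpp is a placeholder):
if command -v Magick++-config >/dev/null 2>&1 && [ -f demo.cpp ]; then
  g++ -o demo demo.cpp `Magick++-config --cppflags --cxxflags --ldflags --libs`
fi
```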

## draft: notes on vermicomposting

Tips and tricks for vc

stay away from acidic items, oily items, meats/cheese (longer to compost)

NO

orange rinds, citrus fruits (attracts fruit flies)

onions, broccoli (strong odors)

salty or acidic foods

avoid grease, fat, bones, fish, and meat scraps. Fats are slow to break down and greatly increase the length of time required before compost can be used.

Avoid diseased veggies, diseased anything.. compost will heat up, but microbes will still be there after years.

Weed is a problem. If it hasn’t developed seeds, it can be thrown into the bin, but weed seeds can live through the compost cycle and show up in the garden months later.

Tightly closed bins are bad because air won’t go through.

vegetable seedlings are susceptible to microbes in worm compost, and it’s difficult to sterilize it. Worm compost is not recommended for seedlings? (verify this)

does one need compost starter? How do you do quality control? What metrics am I looking for? Microbes in air could be enough to begin with assuming air circulation exists in a bin.

Eggshell exception. Wash and dry them out. Do not feed the worms eggs, but they do need calcium. So dry out the eggshells and crush them. Caveat.. save your eggshells.

don’t drown your worms. Include holes in the bin, but you can’t leave them outside since Las Vegas heat is about 115F in the summer time, which is about 35 degrees too hot. Around February the ground surface will get to about 45F in the morning.

YES

raw veggies

newspaper strips

to increase population: sweet fruit scraps like banana peels, cantaloupe rinds and apple cores, and pumpkins! (jack o lantern)
old vegetable material, coffee grounds, tea bags, old newspaper, shredded documents, and fall leaves

leaves

bin will need: moisture, air, food, darkness, warm (not hot) temps.

Bedding from newspaper strips/leaves retain moisture + contains air space (does ink harm the soil?)

they also eat bread and wheats

mix scraps with the veggies

sprinkle bedding with water until damp

Composting occurs best at a moisture content of 50-60% (by weight); there is a ‘squeeze’ test

grass clippings (who has grass anymore?) have nitrogen.. do not throw them away; they will decompose fast. But don’t allow them to clump. If they’re big, grind them down. They’re very moist though, and should be mixed with a dry substance to avoid becoming anaerobic

TYPES OF WORMS

red worms or red wigglers

Eisenia foetida

Lumbricus rubellus

Nightcrawlers are large, and need room. They won’t like bins. Some are naturally deep dwellers. Deep as in 6 feet. Red wigglers are okay with 6 inches.

they’re surface-dwellers. Prefer to live in the top 6 inches of soil

thus shallow bins rather than deep is preferred

for a wood bin, line the bottom with plastic? Cover with a loose-fitting lid, to allow air into the bin

after several months worms need to be separated from their castings, which at high [concentrations] create an unhealthy environment for them

to prep for harvest. Do not add new food to the bin for 2 weeks.

Make compost tea: just add water, let it steep/stew w/e for a day. Then add the “tea” to your plants

| Name | Common name | Feeding/output | Temp low | Size | Life span / notes |
|---|---|---|---|---|---|
| Lumbricus rubellus | angle/leaf/garden/drift/red marsh worm | | 38-40F | 4in | cold weather.. |
| Eisenia fetida | red wigglers | 2lb eats 1lb/day | | 1.5-2.5in, 1lb to 2ft-sq | 2-5 cocoons/wk, hatch in 1.5mo, lives 2-5yrs |
| Eisenia hortensis | European night crawler, Belgian night crawler | | cooler temps, moist environments | larger than ^ | 1.5yr to settle into composting |
| Eisenia andrei | tiger worm | | | | unpleasant odor? |

Temps above 85F are harmful to worms. They will try to leave areas that hit 95F. If they can’t escape, they will die. Proteins will denature, etc. (can worms get heat stroke?) Above 80F will slow down the decomposing activities. Worm cocoons can survive freezing temperatures for a couple of months even if their parents die (how sad.. :’( )

Alternative title: Blogging the Red Books

While reading this, I thought, “Why not?” Here’s a list of in-progress books and other books/writings I’ve come across in the past that I simply should share with the whole world wide web. Not being a fan of 3rd-party tools that harvest and conveniently classify your information, here is a good enough place to put it for anyone who cares enough to know what is on my mind.

In progress:

Insup Taylor and M. Martin Taylor
ISBN: 0-12-684080-6

I got lucky and found a hard copy of this book for 3 dollars 2 years ago. It’s been sitting on my shelf that whole time, and afterwards I’ve been using it as a stand because the hardcover binding was the perfect width for one of my projects. I purchased it just to read a few sections that caught my attention when I first looked through it, but when I actually started reading it from the start it seemed to me to be something very fundamental. I read somewhere (probably this book) that language shapes the way you think, and this book goes over not only Roman-alphabet languages but also the ones from the Eastern world. I warn people never to trust a biology book that was published more than 10 years ago, and physiology is applied biology. So, I don’t know. This version was published in 1983, way before I was born. I’m sure there’s going to be some outdated information (or even entire theories), so I should keep reading it with the idea that some of the stuff in it might not be true anymore.

n/a

The Information: A History, a Theory, a Flood
James Gleick
ISBN-13: 978-0375423727

It’s a simple enough read, but it’s structured on themes more than it is chronologically. For example, he keeps referencing back to Shannon in almost every other chapter, but I guess a person can’t write a book about information without mentioning his work. Gleick limits himself to the decimal definition of a kilobyte and ignores the binary definition of 1024 bytes. This leads me to believe he would also be the type to pronounce gif as jif. But really, it shows how in-depth he gets into the technical subjects. The section on qubits was elementary at best, and I assume he included it in the book mainly to claim full coverage. Upon finishing, I couldn’t really figure out what the thesis of this book was. Was it about information overflow? The epilogue sorta left me hanging. I learned plenty on the history side but not so much on the theory side. The flood portion seems rushed at the end. Unless I missed something, Gleick seems to have really just dedicated one chapter to it. History was the stronger side of this book.

X is just another way for Y to make more of X
#bookSelfie

I don’t mean to be biased with the stuff I post here, but I found that the best books I’ve gone through were the ones written by folks with Ph.D.s. Maybe it’s because they’re experts on the topics they write on, or because they’ve had hundreds of hours just doing what “writing” is. Therefore, they portray their ideas better than the layman. I don’t even want to mention the worst books I’ve come across, mostly due to embarrassment at proclaiming I picked them up once upon a time, but you don’t always know that until after you’re more than half-way through the book. We shouldn’t propagate bad ideas, but at the same time we shouldn’t block ourselves from ideas/thoughts we disagree with. It’s a struggle.

To be updated..

## Back to the Basics: Crayon, wp-LaTex, and vimWiki

### Before I forget, I should review some of the basics.

So, this is a blog, and having your own blog requires some maintenance. It’s like owning a house, except the servers are hosted in Arizona, the domain name is registered in France, and I guess you’re not owning said house. You’re renting hardware space in someone else’s basement.. Note where your local and remote machines are. Okay, so I need to remember how to bash function. How to bash function? No, how about just a bash alias?

while we’re at it

but the alias needs to be remembered after reboots

include in file, continue with life.

So with this, instead of memorizing a very long username + host domain I can access my blog with only a few magic words.

.. checkmate. I guess we could include the password as an argument but, no. So long as the remote and local machines have authenticated each other before, this ought to be enough.
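For illustration, the “few magic words” might look like the lines below. The user and host are placeholders, not my real ones, and a function is shown alongside the alias since aliases don’t expand inside scripts:

```shell
alias blog='ssh jbf@blog-host.example.net'        # placeholder user@host
blog_ssh() { ssh "jbf@blog-host.example.net" "$@"; }
# append both lines to ~/.bashrc so they survive reboots
```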

Now, without needing to activate FTP, we can update and modify everything in the blog via the command line. We’ll need WP-CLI. Fortunately the FreeBSD log files kept track of the commands I issued to it oh so many seasons ago, so simply pressing the up arrow key and holding it for a minute or two brought me to the keystrokes. The magic keywords are simply:

Remove/deactivate unneeded plugins, which could become hazardous to your health. For more information, these functions are well documented at http://wp-cli.org/docs/ or simply refer to the man pages.
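The elided keystrokes were presumably along these lines. wp core update and friends are real WP-CLI subcommands; the guard just keeps the sketch paste-safe, since WP-CLI only works from a WordPress root:

```shell
have_wp=$(command -v wp || echo "none")
if [ "$have_wp" != "none" ] && [ -f wp-config.php ]; then
  wp core update            # update WordPress itself
  wp plugin update --all    # then every plugin
  wp plugin list            # eyeball candidates for deactivation
else
  echo "needs wp-cli and a WordPress root; see http://wp-cli.org/docs/"
fi
```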

### How to syntax on wp?

since we’re going to be typing code here and there, a nice plug-in to have is:
https://wordpress.org/plugins/crayon-syntax-highlighter/

from here we can simply download to our own machine, or even use SSH from the remote machine to download the file from a selected URL https://downloads.wordpress.org/plugin/crayon-syntax-highlighter.zip and then save it to the remote machine’s plugin directory, then..

via wp-cli
search for “crayon” to get a list of plugins already available through official channels

So now we have pretty colors that support languages beyond what regular vanilla WordPress supports. You could also git clone or wget + unzip from https://github.com/aramk/crayon-syntax-highlighter.git to get the same effect, but we’re already using wp-cli to run an update so might as well do it from the remote host.
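For completeness, the wp-cli route boils down to one line; the slug is taken from the wordpress.org plugin URL above (guarded so the sketch is paste-safe):

```shell
slug=crayon-syntax-highlighter      # from the wordpress.org plugin URL
if command -v wp >/dev/null 2>&1 && [ -f wp-config.php ]; then
  wp plugin install "$slug" --activate
else
  echo "would run: wp plugin install $slug --activate"
fi
```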

### Next order of business: LaTeX.. how to LaTeX?

simply typing in:

$latex i\hbar\frac{\partial}{\partial t}\left|\Psi(t)\right>=H\left|\Psi(t)\right>$

should produce the rendered equation.

Unfortunately, I won’t get to see it until I preview it on the WordPress side, had it even worked to begin with. Apparently a viable plugin would need access to a LaTeX server in order to generate some sort of image. It could be a .png, and there’s even one that generates .svg vector images, but it pings some guy’s web basement server and I’d rather not have my blog reliant on whether or not some random guy forgot to pay their electric bill. Of course, there’s always the option to self-host a LaTeX server on my own or a remote machine. A search for latex on the plugin menu yields several attempts at doing this.

While searching, I discovered an old .png generator for latex inputs. It turns out to be not-so-old, and is the current method wordpress.com uses to render .png images. You can play around with the below URL.

So, there’s a wordpress.org plug-in that utilizes a similar, if not the exact same, generator from wordpress.com. Note the distinction between the .com and .org service. It also has an option to generate from a self-hosted server. Humm..

pro: less bandwidth
con: if access to wordpress.com goes down, image rendering capabilities go down with it. This will happen several times a year depending on who’s attacking the DNS.. but if the DNS is down this website will probably be inaccessible as well. Except in Australia. There’s a story to that.

To be reliant on external services.. Sure, to an extent.

for more details: https://wordpress.org/plugins/wp-latex/faq/

### how to vimwiki?

I’ve been reliant on gnote and random .txt files scattered around my drive to write down random things. A friend of mine recommended vimwiki, almost randomly like he had been reading my mind (or had shell access to my laptop).

Vimwiki is more of a plugin than anything, and should be easily installable through pathogen.

You can find the installation instructions here: https://github.com/tpope/vim-pathogen
and for vimwiki: https://vimwiki.github.io/

I shouldn’t need to go through writing the instructions here because the details are literally right there.

What I did need to figure out was that I was expected to make my own index.wiki file inside the ~/vimwiki/ folder. I was surprised when I could “enter” and “backspace” between files joined together by a link. #jawdrop.

The HTML export command creates a folder with an HTML version (along with a .css) of the page you’re on. It doesn’t follow through the nested links though. With maybe about 100 functions on that help page, it’ll take a while to get acquainted with this program. I think it’s safe to say it has more functionality than what I had been previously using. I’ll play around with this for a few weeks and then decide if I’m better off with it.

What’s next? Humm… I have less than 4 days to prep up on my Korean, toy around with some Java, and I think they’ll be putting me in something about a “smarter cities” think group. I suspected it was related to the CES hackathon, but that’s this weekend. I should probably go to that, but I need some ‘me’ time, and me time includes prepping up on my Korean and toying around with some Java. Also, my friend tells me I need to get more sunlight. I do.

I had been invited by some friends to join them at the CES hackathon this weekend, and despite making an attempt to print out my ticket, I just received an e-mail specifying that entrants needed to be registered for the CES portion by 7 days ago. That would be a separate registration process, and they’re only letting the first 450 people in. It’s fine. More ‘me’ time. Also,
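the elided one-liner was presumably an ImageMagick color swap along these lines (-fill, -opaque, and -fuzz are real options; the black canvas below stands in for the actual ticket image):

```shell
if command -v convert >/dev/null 2>&1; then
  convert -size 60x20 xc:black ticket.png       # stand-in for the real ticket
  convert ticket.png -fuzz 20% -fill blue -opaque black ticket-blue.png
fi
```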

the idea was to change all your black pixels into blue ones in case your printer ran out of black ink. Sadly, I also just found out that we have a Kodak that refuses to print if it detects the black is low (or refuses to print black if it detects the color is low). Hardware should not be used like this against the consumer’s interest. Alternatively, I could’ve gone to a UPS store or a library that opens early to get something printed – mobile ticket barcodes ftw.

On a lighter note, I finally received the schedule for the Ajou workshop on Monday, so it looks like I’m slowly placing things on that to-do list.