# Introduction (view-source!)

<p style="margin-top: 25px;">Hi, this is my daily work journal. Although you're certainly invited to read it, there's really no reason to unless you like wading through the random thoughts of another person. In other words, there's a really terrible signal-to-noise ratio here -- kind of like a random compilation of notes for a book. The twist is, I'll be "extracting" the book through a series of data transformations and queries. Often, the two are the same. I will also be refining this file, adding metadata and other things to slice and dice output and results in different ways for different uses, devices, etcetera. Tools matter: Linux, Python, vim, git -- yay, yay, yay! SEO blah, blah. This page is a one-page website experiment, plus a sincere journal and an exercise in Cialdini's principle of commitment and consistency. Ex Amiga fanboy. More within.</p>

### Places you'd probably rather be:

<ul>
  <li><a href="http://mikelev.in/">A More Organized Place to Learn About Me</a></li>
  <li><a href="http://mikelev.in/2011/01/python-programming-language-advantages/">Learn a Programming Language You'll Love</a></li>
  <li><a href="http://mikelev.in/ux/">Run Old-School Linux Instantly</a></li>
  <li><a href="https://www.instagram.com/miklevin/">All The Pretty Pictures</a></li>
  <li><a href="https://www.youtube.com/user/miklevin">My YouTube Channel</a></li>
</ul>

### I begin my day by asking myself these questions:

1. What's most broken?
2. Where do you get the biggest bang for the buck?
3. What plates need to be spun?

BE WHERE MANY OTHERS WANT TO BE, AND HELP LEAD THEM THERE.

### In "Striking Distance" Projects I Try Advancing Every Day:

- **FirstDoser** - Give people "free value" to build trust with the audience
- **ZipUpEvents** - Link to my YouTube & Github publish events near the same dates here
- **VideoFolio** - Organize my YouTube videos into galleries on MikeLev.in
- **Mydentifier** - Link together all my channels so audiences can self-sort
- **MediumWell** - Extract journal entries here as draft articles on Medium.com
- **MailMinion** - Email digestion, posting here, dispatching instructions, etc.
- **AntiCloud** - Defend against loss using services and a personal data center

### List of Projects I'm Deferring

- HOME
  - New Eye Exam -- Bifocals?
  - Make dentist appointment
  - First 80/20-rule Pass on Any Given Room
  - Switch the MikeLev.in comment system to FB
- WORK
  - Daily Report Expansion

# Beginning of Journal

--------------------------------------------------------------------------------

## Mon Aug 29 11:04:00 EDT 2016

### Monday

Okay, getting ready for today / this week. I know I still have some git entries at home to push out, so that'll be a merge from home. That keyboard at home is really freaky. What a birthday, too. I really love spending my time with Adi. Her interests and mine are so totally aligned. But I do have to dramatically clean up on the home front. I should really get into the mental state at work of preparing myself for when I get home -- hitting the ground running, and hitting it hard.

Mon Aug 29 17:46:16 EDT 2016

Lots of talk today about the Google announcement about penalizing modal interstitials starting January 10, 2017.

--------------------------------------------------------------------------------

## Sat Aug 27 16:47:07 EDT 2016

### Puttering With Rainbow Ripples on my Razer BlackWidow Chroma Clicky Keyboard

This is turning out to be one of the most interesting and best "textured" days I've had in a good, long while.

I'm looking forward to my Twelve South BookBook wallet getting distressed and broken in. I already dropped my phone fiddling with the fob to unlock a door before I realized I didn't even need to open my wallet to use the fob. Then, I discovered my lost keys. Good feeling.

The time has come, the Walrus said, to get your ass organized. Use your time when you have it, and that means when you're with Adi, to get your dumb ass organized. One of your (and by your, I mean my) problems is that you feel too much like you "made it", no matter how hand-to-mouth you're actually living now, money-wise.

Adi still managed to steal my chair in this room (not my computer), so I'll go get a kitchen chair. Wow, this is really an awesome experience. No matter -- it contributes to getting into the flowzone. Yeah, is it "the zone" or is it the "flow" or is it the groove? I hereby label it for my own use the flowzone, using a bit of redundancy to eliminate any question of what it is I'm talking about. These keyboards contribute to getting into the flowzone by constantly rewarding the animal part of your brain while you type.

Sat Aug 27 18:16:19 EDT 2016

Things are still going pretty well. It's almost spiritual to have the keyboard rainbow rippling in response to typing... and I am actually puttering, doing an 80/20-rule first-pass organizing of the apartment. It's amazing. Whatever room Adi switches to, I find something to do there. I think I'm on the verge of a re-organizing, maybe using Adi's wardrobe for my own use, and then giving her the half-wardrobe, and maybe even the Rubbermaid $70 shelves from Amazon. That would be great for getting her stuff out of the way. I think she needs closable cabinets now more than she needs wardrobe space here at this apartment. I'm inching back to not being ashamed of the way I live. It is my birthday present to myself.

The difference between winners and losers is finding a way to find the time, regardless. A lot of that is optimization, born of a higher-level, abstracted understanding of experiences and an improved general set of reusable solution patterns being at your disposal.

--------------------------------------------------------------------------------

## Sat Aug 27 13:55:58 EDT 2016

### Happy 46th Birthday To Me - Razer BlackWidow Chroma Clicky Keyboard Extreme

Happy Birthday to me! This is my first time typing into my journal on my new Razer BlackWidow Chroma Clicky keyboard. It all started when I realized I wanted illuminated keys on my keyboard at night when I typed, because I wanted to do more of that "vampire time" type of concentration-time that my boss talks about. The problem is, I'm exhausted and not motivated to do that sort of thing when I get home at the end of the day -- and I know that it's so much about mental attitude, which comes about by shaping one's own environment and using tricks to trigger different happy chemical neurotransmitters. It is so irrationally tied to environment and stimuli. And so, the silly Razer keyboard. It's one more slightly strange and different key arrangement to get used to, but there's big reward. I can't wait to actually figure out the LED programming that helps me most late at night. Right now, I'm typing on a ripply rainbow.

What I want for my birthday is to putter around the house all day and gradually get my place into not-objectionable status. That's merely a bunch of 80/20-rule sweeps, while having peak energy and without Adi pulling me away as soon as I start to do something that really starts to help.

Wow, this is an awesome feeling. Being on Apple desktop hardware really isn't bad. They got to the superior UI, like good full-screen support, just like usual. Proprietary isn't always that bad if they're working their asses off to give you the best product imaginable, and sometimes they do in fact know what's best for all of us. Someone at Apple remembers what it used to be like being on the Amiga computer, and is trying to deliver a semblance of that experience to us through OS X -- soon to be, once again, MacOS. Good! You can feel the subtle compass-less cluelessness of Android and Windows, just aping Apple at every turn, often getting the so-important tiny details all wrong. Shame on me for throwing 3 years at Android in the vain hope that it was just the way I saw it, and not really inherent inferiority, due to the chronic pandertitis that plagues everyone who thinks they can out-Apple Apple. Maybe they should focus on out-Amiga'ing Apple instead, so they can generally get out ahead of where Apple seems to constantly be trying to go.

Even to this day, to switch between full-screen apps, there's no hard-and-fast equivalent to Amiga+N. Some might say Alt-Tab, but that's only task-switching on Windows. Some might say the multi-finger swoosh on an Apple Trackpad, but that's only on Apples with trackpads. I see that Apple Key + arrows is doing it for me right now. Maybe I'll try to commit that to muscle memory for when I'm using this keyboard, because moving my hand away from the keyboard really does throw off the groove. It's amazing thinking how much lost productivity for advanced people came with gained productivity for beginners, thanks to the mouse.

Okay, Adi followed me in here of course and demanded Gravity Falls on the very system I'm typing on with my new keyboard. I held firm with a "No," explaining that part of my birthday present to myself is being able to type on this keyboard, and that I could set up another screen in this room for her to watch on, and she was fine with that. I set her up on the Microsoft Surface Pro 4, which now has a smashed corner, but which doesn't interfere with watching the screen. I may try to fix it myself at some point, or maybe just see if I can pay to have somebody else fix it, or better yet, just be happy with it for a few years until I'm ready to buy something newer and better -- yes, that's probably the most likely. But I DO want to start doing some more do-it-yourself broken screen fixing, like of old iPhones and stuff.

Okay, but back to the order of business for today -- puttering and organizing... ALL DAY! Wow, it's been like 10+ years. And in the end, days like this are what make all the difference. I think getting through to Adi is helping too.

--------------------------------------------------------------------------------

## Fri Aug 26 09:45:17 EDT 2016

### Today

It's definitely very odd and weirdly satisfying to have my phone now be an integral part of my wallet, with my new Twelve South BookBook wallet. It's going to take some getting used to, but it's very worth it. Adi will have some adjustment to make, me not being on Android anymore. I like Apple for my main phone, and I caused myself unnecessary pain and anxiety for long enough forcing myself to get familiar with (and used to) Android. Game over.

Fri Aug 26 12:02:34 EDT 2016

I'm having a great deal of difficulty "finding the love" in my current project, mostly because of the alternating between Python 2.7 in a Kubuntu VirtualBox and Python 3.5 in Jupyter Notebook...
well, anywhere else (Mac, Windows, etc.).

Fri Aug 26 16:30:25 EDT 2016

Okay, heading out now. Did a big-ish deliverable for a stakeholder today, separate from the report update. I think I need to get a bit of distance from the report for some clarity. The debug mode that I'm adding now is key.

--------------------------------------------------------------------------------

## Thu Aug 25 13:22:17 EDT 2016

### Have to convert Python List to Dict with Win/Loss Scenarios

Ugh! Okay, time disappearing again. Buckle down and FOCUS! Love what you're doing. I have a more usable DEBUG mode in the reports now, which lets me set a whole set of values to keep file collisions from happening in the non-debug file locations. So now I can run this thing freely over and over, but each time takes a VERY long time, especially if I'm generating all the files from SQL every time.

Okay, I can see where my logic has gone terribly awry. I need to make the keyword_dict that uses URLs as keys ACTUALLY grab the top performers, and not just the last one encountered in the loop, as it currently does. Sheesh! I totally knew that issue was coming, and I forgot it in my initial implementation attempt -- stupid! Maybe I just reverse the sort order of the original list before the loop!

--------------------------------------------------------------------------------

## Thu Aug 25 09:22:09 EDT 2016

### Efficiency Lost

Handle today much more efficiently. I was terribly inefficient on this last report update. Allow yourself a few thoughts before totally throwing yourself into focus-mode.

I am falling behind on several fronts in life, but simultaneously pulling ahead in others. The trick now is to pull ahead on all fronts, and that's a bit tricky given how much I'm "giving my all" on both the professional front (SEO, Python, a 2+ hour commute every day, etc.) and the Adi/Daddy front (weekends, Catskills, nightly calls), so that I basically have no time or energy left over for the rest of the essentials. But I'm taking steps to fix that now, going into my 46th birthday. There are not a lot of professional years left, and I'm getting into those years that, when I think back, I KNOW I remember with my dad. These were, I believe, the most impression-making years in forming that higher-level, abstraction-thinking adult. This is where and when those adult-thinking abstractions are formed, and I'm going to be the coolest dad ever -- right as I do right by my employer, earn well, and get myself financially responsible and back on track.

But first, get a few things to get my head into the right place. There is a feedback loop between environment and mind. Get that feedback loop working in the positive direction -- and THAT means a Razer Chroma keyboard for myself! And a few other things (magnets, pocket microscopes, magnetic silly putty, solar cockroaches, buckyball magnets, etc.)

--------------------------------------------------------------------------------

## Wed Aug 24 14:26:09 EDT 2016

### Paralysis... Un-Paralyze!

There are so many utterly terrible ways to go about this next step on the report updating that this may be the reason for my seeming paralysis. It seems like it should be easy, but everything as it exists is such a balanced creation that to go in and start doing stuff is to risk collapse.

Wed Aug 24 18:55:00 EDT 2016

I think I've got it. Running a test. It's long-running. Gonna head home now.

--------------------------------------------------------------------------------

## Wed Aug 24 09:31:12 EDT 2016

### Switching from Android back to iPhone

Love my podcasts.
Have to think of more beyond the Python ones, and even the programming ones. It's also strange being back on the iPhone as my main platform. I think after 3 years with Android, I've drawn out of it most of what's valuable about it. I think it amounts to:

- Multitasking
- Knox (on Samsung platforms)

...and not much else. All that "freedom" stuff people talk about is patently false. It's amazing that anyone gets anything done on Android. It's subtly distracting and disjointed. Apple's iOS has its problems too, such as sandboxing apps to an inconvenient level, but I see why, and I appreciate the results of it.

Okay, difficulty focusing comes from difficulty focusing -- NOT environment. After a bunch of jobs now, I see that my focus really amounts to my LOVE for the work I'm doing, my ability to filter out everything else that's going on, and sincerely just being "transported" to a different place -- and THAT'S not easy. The legitimate complaint is the number of things that jolt us OUT of that strange zone, and not the process of getting into it.

- The responsibility for getting INTO the zone is on us.
- The blame for jolting us OUT of the zone is on whatever allowed that jolt-vector to exist, be "turned on" or whatever.

And so... and so... yes! I got it. vim is a pathway into the vector, but one that comes equipped with its own warning of how difficult it is to get and stay there.

I'm experimenting with music again. Now that I've listened to podcasts and audiobooks for a while, I'm beginning to condition myself to "just listen" and "not listen" all at the same time -- something I've always had difficulty with. My hyper-literal conscious mind has had quite a difficult time with music all my life -- I think that's why I love the sort of things that are rife with lyrical brilliance, like Disney's Be Prepared. I just have to let the rhythm carry me on occasion.

Adapting to different platforms rapidly, and living happily within their confines, is incredibly important. Don't balk at the restrictions. Embrace them, and make them yours. Make other people look in admiration at what you can do, no matter the adverse or strange conditions. Okay, what you do now is shape reality moving forward. Learn your lessons from all that stuff you've been reading about the mind and quantum reality. I'm doing it one way as I type in vim, and that's why it's a transitional activity... a catalyst, and the reason that whole 1, 2, 3... 1? thing is so friggin' important.

Wed Aug 24 13:28:03 EDT 2016

Still delaying... shit! I have an accumulating set of work to get done, but hit that invisible wall. Time to plow through it... because I'm a fucking human. I can control a lot of aspects of my behavior, and where I can't directly force myself into certain behaviors, I can do life-hacks to compel myself to at least start down the right paths. Small decisions can have huge consequences.

--------------------------------------------------------------------------------

## Tue Aug 23 16:33:34 EDT 2016

### Productivity Issues Today

NOW! 1, 2, 3... 1? I'm gradually losing my ability to focus today. I need a better night's sleep than I got last night. I need to think through my challenge and do it cleanly and efficiently. I need to look at the problem anew, knowing that I have this big, wonderful tens-of-thousands-long keyword list to work with. I need to start getting my life into better order on the home front to better accommodate things on the work front.
Consider heading out early enough so that you still have some energy when you get home, and do a big, wonderful 80/20-rule pass over your place. Lock in on the refreshed and renewed interest in organization and treating things well that this new iPhone 6SE is infusing back into you. It's been a long time since I've had this feeling. The iPhone got SO MUCH right, and about all it seems that Android got right is multitasking and Knox, which is not even an Android innovation, but a Samsung one. I will have to unlearn a few Android habits, but it won't be hard. This was a strange 3 years, and the few things I used the Stylus for were totally not worth it -- except maybe insofar as making sure Adi is multi-lingual, such as it were. And she is now. I'll figure out some way to give her what she wants (her Panda games). Maybe I'll make that a car phone. That'll be a good idea. Gotta get that cracked screen fixed on the Note 5... maybe.

--------------------------------------------------------------------------------

## Tue Aug 23 10:04:26 DST 2016

### Done With Platforms That Subtly Add Stress

Wow, it seems like forever later. I took Monday off and spent it with Adi. I didn't do a single journal entry over the weekend, I think. I just switched back to iPhone, and am now on the iPhone 6se, interestingly enough. I just love this small form factor. I will not have to carry around 2 phones again, and I'm going to be on the platform which is my real preference (if I'm being honest with myself), anyway. And I just now ordered a leather wallet case for it that holds credit cards and money.

Tue Aug 23 14:35:52 DST 2016

Had a good meeting with one of the awesome stakeholders here. Gotta think about the rest of today. It's already after 2:30 PM and I'm slipping further behind. I believe I need to start exercising discipline. Get myself back into the flow and the zone. Just not... not... not in the correct mindset. Oh, I also have some help to offer a department that sits next to me on preparation of content for an upcoming show... okay, touched base with them. Now... now... think! Clarity. Nah, clarity not required. It's the 1, 2, 3... 1? thing again.

Okay, for starters, get the hell off of this Windows 10 virtual machine under VMWare Fusion that I'm using merely to test the Ubuntu Bash Shell under Windows. I've tested it. It's great. It's going to make Windows usable without Cygwin. But now, stop using it. It's a distraction. You are constantly replatforming, and it's a source of stress, and the pure Apple stuff works just so transparently and without additional cognitive burden. You've switched from Android (Marshmallow) back to iOS now. So embrace it.

--------------------------------------------------------------------------------

## Fri Aug 19 11:55:08 DST 2016

### Refactoring SEO Pulse Reports

I still have tons of work to do today, and a 1 PM call coming up. Go grab an early lunch. Be fueled-up and ready to hit this thing hard. Light meal and plenty of hydration and caffeine. Okay, got my food. This feels pretty good. Aside from the dissociated spellcheck file (which I'll fix), this totally could be my day-to-day journaling environment.

Okay, focus! 1, 2, 3... 1? Just change the way the dates are being set in the forked copy of the reports repo. Okay, I got the date system ported over to the new way. That was a nice round of cleanup on the report.py file. Next! 1, 2, 3... 1?

Ugh, I'm actually going to really clean up the keyword section of the SEO Pulse Report to be the super-set keyword list I need everywhere else. I mean, why not? It's the super-list. I just need to keep the database field names identical! Ugh, no! I'm looking at the logic, and I pulled off some really amazing stuff there with joining multiple tables and rearranging columns just-so. Ugh, okay, let me think. I would really like to gut that area of the report and rebuild it based on the new logic. I mean, why not? It's the same list (essentially) that I need elsewhere, and I have the logic essentially written AND MUCH SIMPLER! So, why not? Gut first is the rule. Don't try to figure out what you WERE doing, but rather just go for the new method. Use it here as the new Keyword table, and then join ITS results to the other URL tables. Hmmm. Okay, remember these column names:

- keyword
- positiondy
- positionwk
- positionmo
- position90
- clicks (for what time period?)

Okay, I'm hopeful again. If I'm only displaying those few columns for the keywords table, then I can keep those column names exactly the same and in exactly the right order... ohhhh, I may need to execute a whole bunch of foreign table drop and create commands.

--------------------------------------------------------------------------------

## Sat Aug 20 15:14:22 EDT 2016

### Found A Guide For The Perplexed

The law of eventual return. If it was meant to happen for you, it will. It might take a while, and you have to do more than your part of the work. But if you keep it up, and are intelligent enough about it, you'll probably get there by 50 or 60 years old. And if not, it was fun trying, huh? But it can't always be... tough luck, Einstein. No grand unified field theory for you. But who wouldn't want to be along for that ride... uh, yeah, right, okay, sorry. Anyhoo... solipsists are ultimately correct (not existentialists, after all) because you can't disprove them, due to that pesky problem of induction. And so, why not go for the gold, in whatever event you enjoy doing? Seems like as good a way as any other to pass your time. So glad I read Iain M. Banks's Culture series.

Now, off to the Catskills to meet Adi. Oh, one Audible tape update. I'm so burned out now on anchors and cognitive biases (Thinking, Fast and Slow) that I need a dose of... Quantum: A Guide For The Perplexed.

--------------------------------------------------------------------------------

## Sat Aug 20 09:15:11 EDT 2016

### Ending Your Turn Fairly Satisfied

Back in vim. I feel really comfortable typing in vim. I wish I had had this skill starting from like 10 or 12 years old. Things to keep in mind when trying to take up the new skills that really make all the difference:

- Yes, the ways of the new generation often ultimately replace (trump) the old... but... the ways of the old generation are that way for reasons worth knowing (like, how do those robots work and how might we turn them off?)
- The ways of the old generation may be so lost by now that even their creators no longer see their connection to the modern world, such as with the creator of the vi text editor, Bill Joy. Efficiency always still gives advantage.
- What comes around goes around -- always. It's not like society will have to reboot from total collapse, but it's nice to know how you could. Knowing things down deep helps when things go into flux, as they always do.
- Things are so complexly built-up these days that they have become, as Arthur C. Clarke once said, indistinguishable from magic.
  In general this is fine for improving human quality-of-life, but it must come with a warning.
- While inserting extra abstraction-layers is fine and necessary -- nature itself favors this trick -- the experiential trickery we (humans) have imposed on our own world will too easily become a dangerous prison the moment we forget.
- There are a few really important (fundamental) world-hacks to know. How transmitters and receivers are the same thing is one. The two-slit experiment of quantum mechanics is another. It is always important to know these things.
- In life, in general, especially in your times, there is just too much to know. You can't know everything, so get the gist of what the latest in physics and grand unified theories tell us. Then, focus on whatever else you like.
- Relays, vacuum tubes and transistors all help produce "digital logic" through a reduction in the actual richness and infinite depth of our physical world, and should not be over-valued as an apex of technological accomplishment.
- The world's not digital. Nor is it even binary or bound by tight, confining rules but for a few, like the speed of light. Everything else is made-up human mental constructs and should often be challenged.
- Much of this reality-challenging is being done by particle physicists through mind-numbingly boring experiments of smashing atoms with large atom-smashing colliders to see what happens. Though important, it's just one route.
- Big world-shifting changes do occur just as a result of thinking. People have these thoughts all the time, but do nothing about it. Some who have done something about it include Copernicus, Galileo, Newton, Planck and others.
- If you think hard enough, are well enough educated, and have good communication skills, you can make it possible to blow up the world and end all human life. A man named Leó Szilárd did just that, and showed Einstein.
- We survived people having those amazing world-changing thoughts, but just barely. It's like when Sammy clawed you near your eye. Humans are a lot like infants, Sammy is like the world, and tail-pulling is like chain reactions.
- This should not scare us. It should just drive us to be cautious when we invent, and to have good friends to look over our work, give us opinions, and chime in at the critical times (as with the Szilárd-Einstein letter to FDR).
- Looking at the same things as everyone else, but coming up with different results, is one step in the process. Persistence, tenacity, confidence, and exercising good communication skills are most of the other steps.
- At this point, you should watch Richard Feynman, particularly his Caltech lecture series. Then, watch The Honeymooners. Now, ask your mother about old-time New Yorkers and your Great Grandpa Bernie. You're a New Yorker.
- Occasionally, a step is really-really-really hard. If you find this to be the case, back off the problem and focus on something else (related) for a while. Sleep on it. Come back. Hard problems often require a new perspective.
- Taking a break is never a crime. Life itself seems to swing back and forth rhythmically like a pendulum. We go bizarrely unconscious every night to optimize and index our databases. Get a good night's sleep... often.
- Our bodies need a bunch of things to operate correctly. One is starchy carb-sugars to burn like slow-burning logs for energy (candy is tinder). The other is protein for growth and repair. Stay hydrated and eat diversely.
- As beings, we are layered up, increasingly complex circles around an inner fish at our core, following the 4-limb amphibian design branch on the tree of nature. Rodents, social monkeys, us. "Us" is only a tiny veneer on top.
- Most people are lazy and scared of new things. They are following closely in the footsteps of those who have come before them, sort of running their lives on automatic, not taking full advantage of what it is to be human.
- At this point, you should read the SciFi book Dune. The movies are not good enough. Get the point of the gom jabbar. No matter how much you think you may be in control of yourself, you are only an animal, without crazy-discipline.
- Because thoughts flow freely in an unlimited brain, but actions are inhibited by our bodies, circumstance, motivation and the physical world, there is nothing more important than deciding what to do next in the moment.
- Many good thoughts that run through your head are momentarily considered and dismissed by the part of your brain that could do something about them. The big differences in life come down to these tiny "right now" moments.
- But don't let that thought paralyze you. We don't (can't) live optimized and efficient lives. We are each just one in a long chain of billions and billions of similar humans. Just make good, thoughtful decisions on average.
- You chain up these momentary decisions from the time you're aware of the fact that you're kinda-sorta in control of your own life, until the day you die. There's more, but that may be all that we as individuals may ever know of it.
- It's all a constant building and refining and pruning and shaping of your own mind. Your brain holds a subjective copy of reality, like a virtual universe. I believe this is what makes humans most different from other animals.
- And so, there is an often-discussed "inner world" in literature, stories, and much human thought. I once read one called The Master Key System. All this self-help crap says the same thing, but it's true.
- Imagine new things in your inner world. Don't just reproduce the boring old world as you know it, but experiment with new possibilities, plotting a course through life you think you might like. Take small steps towards it.
- All this can give rise to notions of greater purpose, the meaning of life, and other unknowables. Don't go crazy over it. Someday, we may be in a position to know, but for today, just live well by your own standards.
- Create your own standards, and dismiss others who try to judge you by their standards (control you). Much of what you think and believe comes from me and your mom, and those are our own standards (not yours). Shed imposed limits.
- Much of the point of all this stuff is to simultaneously feel good about yourself and your immediate circumstances and conditions (the things you can do something about), and to end your turn fairly satisfied.
- Google belief systems like existentialism, solipsism and nihilism. Similar human experiences can make one person depressed, another pleasure-seeking, others mean and self-serving, and still others heroes trying to help everyone.
- Not everybody has the same view of life as this, and many folks will try to impose their beliefs upon you (similar to what I'm doing here), but WITHOUT telling you to think for yourself. Beware of wolves in sheep's clothing.
- It's not always easy to tell one type of person from another at first.
  This takes practice, and you should strive to surround yourself with people who bring out what you believe to be the best in yourself.
- Ultimately, we humans (and most life) are self-contained, self-animating, temporary and reproducing units of energy/matter with lots of sensory equipment giving us empirical evidence of an objective common world.
- The rest is up to you, and not up to all the bullies around you playing the exact same game as you. The only difference between you and them is a roll of the dice concerning genetics and initial circumstances in life.
- I can't help but feel that despite the self-evident subjectivity of it all (google the problem of induction), the highest callings in life are those that on the whole help others without hurting. But that's subjective too.
- The only other thing that I find self-evident in this whole "existing" mess is that the journey is probably the reward. We're beings bound inside and limited by the very things we're trying to understand, so don't sweat it.
- Stay in tune with your rhythms and the energy you're willing to put into things, then take on seemingly compatible challenges that push you a smidge beyond your limits. Rinse and repeat. At least, I find that very satisfying.
- There is a beautiful, musical, dancing balancing of attractive and repulsive forces at play amongst the stuff that lumps together to form us and all the stuff around us. And at its most basic, we have very little idea what it is.
- Watch Sagan's Cosmos, and take note of the cosmic calendar. Also think how long a human life is end-to-end. That's how long ago WWII was. Super villains really did try to take over the world just that long ago. We're very young.
- On occasion, there's persuasion that you ought to write insightful
  With an algorithmic rhythm we all learned from Doctor Geisel
  Intertwined communication that something something parameter
  To get across a notion that abiding our pentameter.

--------------------------------------------------------------------------------

## Fri Aug 19 11:37:31 DST 2016

### My First Github Commit/Push from Bash on Ubuntu on Windows

Okay, here I am making my first journal entry in vim-nox from a Windows 10 Ubuntu Bash Shell -- amazing! It wasn't quite smooth getting here, but it also wasn't difficult. It just took a long time, and it has all the same issues as settling into any new hardware (plus some extra download, install, reboot steps), but it's working full-screen and quite competently under VMWare Fusion on my old Macbook Air. I have it showing in Courier New with a nice big font, and my .vimrc moved over. However, I can't bring myself to do Dropbox again on a VMWare session after all that drive-shrinking, so I won't be sharing my spellcheck file with this system, which means I'm looking at a lot of distracting maroon words unless I ":set nospell" at the beginning of all my vim sessions. That's going to get tedious. I already broke off the .vimrc because of no Dropbox, so just edit it to change your spellcheck mode default. You can always spellcheck on other machines, which you'll be on plenty. But so far, that's the only downside of this experiment, and it has nothing to do with the Windows platform, but rather with my feeble Windows hosting environment (a VMWare Fusion session). And this is my first commit on Bash on Ubuntu on Windows.

--------------------------------------------------------------------------------

## Fri, Aug 19, 2016 10:10:01 AM

### Last Entry From Windows 7 Cygwin Shell

Okay, one more journal entry on my Windows 7 machine. Ugh, because I run the VirtualBox cron job on this machine every night, I don't want to mess around with the Windows 10 upgrade to get the Ubuntu BASH shell onto my primary working machine -- one less reason to want to use my personal Macbook Air all the time. Sigh, maybe soon. I should bring in some "real hardware" on which to run these reports. Actually, I tried with an ARM device, but that didn't work out too well. Okay, I'm in the middle of making a video showing how to shrink a Windows 10 VMWare Fusion image after a Windows upgrade that plops a 19GB recovery point onto your drive. Or rather than "plops on", it's more like it "leaves a copy" that's de-referenced, but still protected. It's like deallocated bad bytes on a hard drive.

But my work today is really a matter of synthesizing my new-style Python 3.5 workflow (heavy leveraging of Jupyter Notebook for code development) with my old-style Python 2.7 workflow (heavy leveraging of dedicated, server-like Linux instances with just enough installed to have a conservative but workable development and code-execution environment, which currently is a Kubuntu virtual machine under Oracle VirtualBox on my Windows 7 machine). It's a bit of a multi-dimensional schism I've created for myself here, and it's time for me to start re-consolidating my personalities, and that will for the most part be under:

- Any OS
- A superb software repo system to compensate for "any OS" (pip, conda, etc.)
- Python 3.5
- Jupyter Notebook
- vim
- git (and usually, Github)
- The typical set of Linux/Unix commands in a BASH-like shell
- Rapid application-switching between full-screen apps

By the above criteria, Windows 7 with Cygwin, VirtuaWin and Anaconda qualifies as a good development system, and indeed I do most of my daily work (with the exception of the journaling) on a system much like that. I use a separate Mac sitting to the side for the daily journal so that I don't even have to switch full-screen apps to jot notes in my journal (type into vim). I think a lot about absolute and relative positions, and the use of things like "indexes" to access different information (experiences) through the same viewport without moving your frame of reference. But sometimes moving your whole frame of reference is exactly what's required. Doing these switches throws you out of the flow or the zone, usually one way or the other, so why not make it a nice, clean, zero-ambiguity switch, immediately understood by your "system-1" lizard-brain self? It's way easier to turn to another machine and jot a note on a different keyboard than to remember what keyboard invocation switches you to the next full-screen app to the right (for example) and back -- though lacking a whole second system for daily journaling purposes, rapid full-screen-app switching without lifting your hands from the keyboard is the next best thing. Macs accomplish this nicely with full-screen apps, and Windows 10 only just got there with its version of virtual screens -- though full-screen apps are probably even more useful than the now-available "actual" Windows virtual screens. I feel like somebody's trying to make a point with Windows 10's implementation of full-screen modes (apps vs. screens). However, so far I prefer Windows 7 with VirtuaWin over Windows 10.
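
Since I keep straddling that Python 2.7 / 3.5 schism, here's a minimal sketch (illustrative only, not anything from the actual report code -- the read_keywords helper and its file format are made up) of the __future__-import boilerplate that lets one file behave the same under both interpreters:

```python
# Illustrative only -- not report.py. The __future__ imports make Python 2.7
# behave like 3.x for the features that bite most often when porting.
from __future__ import print_function, division, unicode_literals

import io


def read_keywords(path):
    """Read one keyword per line; works identically on 2.7 and 3.5."""
    # io.open exists on both versions and always honors the encoding argument.
    with io.open(path, "r", encoding="utf-8") as handle:
        return [line.strip() for line in handle if line.strip()]


if __name__ == "__main__":
    print("division is true division on both:", 3 / 2)  # prints 1.5 either way
```
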

--------------------------------------------------------------------------------

## Thu Aug 18 20:12:28 EDT 2016

### Cats Not Really Gone Missing (I Hope)

Shit, shit, shit, shit. Adi's... as fast as I started typing that, I take it back. There's a medium-dilemma going on, of Adi's cats in Staten Island appearing to be missing, but it's a big house and they're small cats. The person taking care of them hasn't seen them in days, and found a window open. I got that call fairly early today, but I've been in touch with the person feeding the cats and getting more info. Sounds like there was a 2nd (of 3) cat sighting. A dark blur running across, and that sounds like the chupacabra herself... in other words, Dorothy. I think Dorothy's safe. But wow, what a wake-up call. Got my adrenaline going good. It's that snap back into reality that everyone talks about. It's a real feeling, that fight-or-flight response. I guess sometimes we maybe ought to manufacture it happening more often. Is that the high of working out? Is that the benefit of exercise? Hmmm.

You can always put off until tomorrow what won't get you thrown in jail for ignoring today. I am paralyzed by just where to start at my apartment. I've put it off long enough, "catching up on my sleep" and "decompressing" long enough. Now, I've got to get my act together. No excuses, and no distractions.

I'm currently reading Thinking, Fast and Slow by Daniel Kahneman. Man, I could have been a scientist like him, if only I hadn't misplaced my passions into Commodore so young. There are other excuses I could use. My parents could have guided me better, but ultimately we all take control of our own fates. The excuses run out. The only question is who ultimately has the most control over you: System 1 or System 2 in your brain. Are you actually ever really in control? There are so many thought-illusions out there, wow! Anchoring effects. The role of associative coherence in anchoring... man.

--------------------------------------------------------------------------------

## Thu, Aug 18, 2016 9:41:04 AM

### There's another name for Pythonista: Dunderhead!

According to Thinking, Fast and Slow, which I am currently reading, the System 2-type thinking that handles exceptions from the System 1-type brain is both hard and requires discipline, and it is the source of great advantage in certain situations where working on automatic will get you in trouble. While you can't really be "always on", in that it uses up all your energy fast, research does support that infusions of sugar-energy can put you back on track. But I'm sure we're trying to reach an optimal and context-appropriate state that balances our energy burn-rate (in part, adjusted knowing our access to new energy sources and our time to digest and make them available) against the size of the tasks we take on, considering their cognitive stress level, the work involved, and the often-times high risk of failure (the assessment that it will be a bad investment) -- especially when considering that the lazy way so often works. We're built up around the presumption of, and systems accommodating the fact that, things we experience once can be useful again in the future, and should weigh towards forever-being-refined automatic rules -- the System 1, such as it is called. This is why I write so much here. It is becoming clear to me why "vim mode" hits something fundamental in the human experience and state of being.
It's a vocabulary that comes naturally, and dovetails perfectly with the other vocabularies we have, such as the English (or other) language, both written and spoken, and Python, mostly written, but actually spoken by an increasing number of dunderheads. Isn't dunderhead the ultimate in self-deprecation? Isn't that the ultimate in expressing what it is to be in the Python community, and the inextricable tying together of the new learner with the old hat? Hmmm, nope, it's registered and on sale for $3,999. Nope, no thank-you. Dunderheadin.it? For an extra $19.99 per year over the registration fee, I could do it. But again, nope. I just gave up MikeLevin.me, as that sort of silliness just isn't worth it -- except in the case of MikeLev.in, which took over for Mike-Levin.com for me, from back in the day when I was stupid and let my registration expire. My continuous domain history could have gone back to 1998, or whenever it was that I went by that.

Sighhhh, okay, back to the 1, 2, 3... 1? Today is your day to nail this report modification project. Nail it. Hit this thing home. Get ahead of your work backlog. That's where your extra time with Adi is going to come from. Knock it out of the park... repeatedly! Become the Babe Ruth of SEO Data Science... for better or for worse, haha! We are all flawed individuals, and sometimes it is most fun when exceptionalism surfaces in the most flawed ones; lots of personality!

--------------------------------------------------------------------------------

## Thu, Aug 18, 2016 9:35:20 AM

### How Useful Will Python Be After the Singularity?

1, 2, 3... 1? This is our perpetual state while conscious. We're stuck in a loop called "this moment". We're riding the read/write head of some 4-dimensional being who dropped their phonograph needle into our branes... or is that brains... or both... or neither. That just about covers how well we understand it all. For my money, Kurt Vonnegut got it right in Slaughterhouse-Five. We're all exhibits in a zoo... portholes and infinitely improbable instances of what this universe can come up with, for better or for worse. God doesn't roll dice with the universe, but that is precisely what every human being is, among other lumpy units of matter/energy. Mattergy? Yup, it's the complete state of mattergy that would have to be observable and copy-able for Star Trek style transporters to really be possible. And the moral implications... Oy! Again, for my money I'd bet on wormhole-style portals being more viable and less of a disgusting mess. Avoid copying.

We often keep track of things through references to instances of our images of the original thing that's filed away somewhere on our drives, in our databases or in our brains. Our computer tech is a product of our own brains, after all. Why should we not be patterning them, consciously or not, after ourselves? References to original copies are efficient, but they're not the whole story. Our sensory equipment, like our eyes, noses, skin, tongues and ears, are scanners and the real-world application program interface (API) to our computing control center that deals with various autonomous systems, the sexiest of which is our thinking mind. But here, a lot of the similarities between our biology and our commonly constructed machines break down -- though we, with a shot in the arm by DARPA challenges, are trying to address that.

Of how much use will it be to program Python, once a running program becomes smart enough to start improving its own code and appending new resources to itself, indefinitely? What that moment is, of course, is new life -- and the singularity point of which Ray Kurzweil and Sarah Connor both speak. But it's unlikely to be some single freak instance in a lab. It's more likely going to be a Github project involving scikit-learn, tensorflow or their descendants running on Python 5 or something. So yeah, learning and knowing how to program Python will likely continue to be important.

--------------------------------------------------------------------------------

## Thu, Aug 18, 2016 8:56:03 AM

### Commit, Push, Paste, Commit, Push... And the Windows 10 Anniversary Bloat

At work early, and back journaling using my Dell ThinkPad on Windows 7 under Cygwin on a VirtuaWin full-screen terminal window with a huge Courier New font, so that 80 columns fill 2/3 of my wide-screen monitor. It's really easy on my increasingly feeble eyes. Reading Thinking, Fast and Slow. I'm listening to the video I made on my phone on the walk to the subway station this morning. It's interesting switching between media of expression, from videos on YouTube, to SimpleNote text notes on my phone, to typing into vim on one of the various laptops and desktop Macs and PCs that I use. Hmmm, I really want to get that Linux-first machine (no Windows/Mac VM host) into the picture. Resurrect my Commodore 64x? Very symbolic. Wow, I do have a vivid memory when I want to -- the C64x early morning debacle. Dropping convenient-lie F-bombs in order to have the automatic and default winning position in any fight coming afterwards, or else I would be forever labelled as (and looking back to this instance as) the bad dad. Well, I reject that reality and replace it with my own, in which such tactics are met with a fizzle-inducing blank stare -- or not even the courtesy of answering the phone (or whatever communication method is being used). Yup... yup.

Had to delete Kubuntu 64 from my 2011 Macbook Air laptop's install of VMWare Fusion, because the Windows 10 virtual machine is demanding so much of the machine's hard drive space. I can't believe what a hog Windows 10 really is / must be. I had to leave the install going overnight just to get the Windows 10 Anniversary Edition installed, and it froze up in memory suspension mode overnight, so I'm still doing the install now, which is alternating between that blue screen with the spinning dot-wheel, whatever that's supposed to be, and a black screen with a simple Windows logo in the middle, and an entirely black screen that forces me to confirm that the whole thing didn't crash -- from which it returns to the blue screen with a new percentage-complete number being reported for Working on Updates. Sighhhh -- however much better Microsoft gets, it's still Microsoft trying to satisfy every last edge case of heterogeneous hardware, and we all suffer. I'm making a video as I go... hahaha.

Make sure you get the updated reports (at least, their data sources) done today. Ahhhh... 1, 2, 3... don't forget to paste in your writing from your subway ride. I really should find a way to start marking up the stuff that I want to publish as stories. Well, I could certainly just make it its own journal entry instead of just pasting it below. And so, commit, push, paste, commit, push...
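
That commit-push loop is mechanical enough that it could be scripted; a hypothetical little helper (the function, message and path are made up here, just to illustrate the ritual -- not an actual script I use):

```python
# Hypothetical helper for the commit, push, paste, commit, push ritual.
# The default path and the example message are made up for illustration.
import subprocess


def commit_and_push(message, repo_path="."):
    """Stage everything, commit with a message, and push to the default remote."""
    subprocess.run(["git", "add", "-A"], cwd=repo_path, check=True)
    subprocess.run(["git", "commit", "-m", message], cwd=repo_path, check=True)
    subprocess.run(["git", "push"], cwd=repo_path, check=True)


if __name__ == "__main__":
    commit_and_push("journal: paste in subway-ride notes")
```
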

--------------------------------------------------------------------------------

## Wed Aug 17 10:53:11 EDT 2016

### In The Melody of Fiddler On The Roof: Transsssition!

Cut! I'm not going to stick to one journal entry per day. It's all about thought-transition. Forcing a thought-transition event is sometimes important in thinking about something fresh. Okay, my strategy has got to be to just program this as a Python 3, SEO Notebook-integrated "thing"... a separate product, if you will. Because pulling this list ALMOST IS the finished product. What CAN'T you tell from searching and sorting this list? WHAT LIST? That's precisely why I wanted to break this out into Python 3 and SEO Notebook. WHAT list is actually a very interesting question. I'm thinking this may be my first utilization of Pandas and DataFrames in an SEO Notebook project. SEONB itself doesn't need Pandas for the basic pipulation process; however, this preprocessing stuff can significantly benefit from Pandas -- especially for doing SQL-like data manipulations without SQL.

--------------------------------------------------------------------------------

## Wed, Aug 17, 2016 10:28:59 AM

### Tending Your Own Career Like a Business (even if you're a clock-puncher)

Okay, I have my Mac back and am charging it now. Meanwhile, start my journal here. Look at the day. I think I'm going to bow out of the tech sprint meeting this morning as well. I got to work a bit late AND let myself dive into a couple of discussions this morning.

I'm listening to Talk Python To Me #71 about Soft Skills, and Peter Drucker-style concepts (though they don't mention him so far). We are each our own business, and people don't realize they need the skills to treat themselves as their own little business, even if they're clock-punching employees. People skills are more important than increasingly commoditized programming skills. It's not just about coding. It's about communicating, and making the team work better. Most of your time is not spent writing code. It's spent writing emails, talking to co-workers, planning next steps. If you're not good at those, you're lacking in the majority of what you do -- however, I'm a bit different. My "people skills" are often expressed in the thought-vomit I exercise here. I'm processing my thoughts and ideas, is what I'm doing. It's a lot like talking to myself.

I do write my weekly reports for my boss, and that's really been something that helps -- and even though I've TRIED to do this in the past in other situations, it's either been un-asked-for, or it's been annoying, over-detailed time-tracking systems (for SOX compliance, at agencies). My weekly report this week is due this Friday. It makes my work more highly visible. Have notes. Not just my own personal notes (this), but also my actual weekly reports that get sent out officially to my boss. Think like a business. What does my manager need? I think I had a talk with him this morning that LETS ME KNOW what he needs from me (whether he knows it or not). It's the portability of processes via Jupyter Notebook... wow! How clear. Such moments of clarity in one's career are, I think, quite rare. It's a real shift in thinking for me. Less vim and more Jupyter Notebook. Okay, get today's journal-writing off the Windows laptop... okay, back on the Macbook Air. 1, 2, 3... 1?
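
Picking up the Pandas thought from the Wed Aug 17 10:53 entry above: a minimal sketch of the kind of SQL-like manipulation I mean -- grouping a keyword list by URL and keeping only the top performer per URL, which is also the "reverse the sort before the loop" trick from the Aug 25 entry, just done in Pandas. The column names and numbers here are illustrative only, not the real report schema:

```python
# Minimal sketch, not production code: SQL-like "group by and keep the top
# row per group" done entirely in Pandas. Column names are illustrative only.
import pandas as pd

rows = [
    {"url": "/widgets/", "keyword": "blue widgets", "clicks": 120},
    {"url": "/widgets/", "keyword": "cheap widgets", "clicks": 45},
    {"url": "/gadgets/", "keyword": "best gadgets", "clicks": 80},
]
df = pd.DataFrame(rows)

# Sort so the biggest click-getter comes first, then keep the first row seen
# for each URL -- the Pandas equivalent of a SELECT ... GROUP BY with a
# "top performer per URL" rule, no SQL required.
top_per_url = (
    df.sort_values("clicks", ascending=False)
      .drop_duplicates(subset="url")
      .set_index("url")
)
print(top_per_url)
```
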

--------------------------------------------------------------------------------

## Tue, Aug 16, 2016 12:13:20 PM

### ShelvItAll comes and ShelvItAll goes

Okay, the trick here is that the things I code in shelvitall can work totally standalone, in capacities other than SEO Notebook integrations. I will be venturing into object-oriented-land through the traditional Simula-inspired (or maybe Modula-3?) class API... ugh! Avoided for very many years, self, self, self. But now is the time.

Tue, Aug 16, 2016 4:36:48 PM

Nope, changed my mind. No reason to go OO on this project. Went into it a little bit, felt OO working against me and not for me, and so it's back to classical functions and procedural programming. Also, I feel the elegance of SEO Notebook slipping away as I introduce shelvitall... ugh. Okay, delete it and rip it out of the repository. Leave the pervasive_data object in location, and use the pipulate.ipynb file for that. That's what it was starting to look like, anyway.

--------------------------------------------------------------------------------

## Tue, Aug 16, 2016 10:03:49 AM

### Adding persistent data module shoveitall to SEO Notebook

Okay, I'm doing my journaling today on my Windows laptop. It's screen-swapping hell today, I guess. But I forgot to bring in my Mac charger, and I don't want to go on a scavenger hunt. Maybe just hit up eBay for another (and Amazon for a bunch of regular stuff I need again). Don't forget about yourself -- although you MUST be more productive when you're at home. It's almost like a feeling of helplessness that sets in at home. The trick today is to totally avoid distraction, and to get this latest work done. But just as important as the "get it done" mentality is simply making it interesting and loveworthy today. I am falling in love with both Jupyter Notebook AND the process of moving to Python 3.5. To simply execute this project under vim in Python 2.7 on a Kubuntu VM that doesn't have Anaconda installed... well, it's motivation-busting. Solve that.

Okay, first (and last!) distraction. I couldn't resist my curiosity about SymPy, and I am soooo taking up Algebra again (in advance of having to help teach Adi) BECAUSE Python makes it so easy. Here's the video that sold me:

https://www.youtube.com/watch?v=cvHyaE_bs8s

I now want to make my own video of this. This guy is actually pretty good, but my series is a tour de force, producing videos every few days, covering nearly every topic. But don't force it. Be genuine. Let the wind fill in under your wings and start to lift you, over however long it takes to lift. I'm a long-runway type of person. Give it a few years now. Do a few more interesting dot-connecting projects. Start referring people cross-channel in all the content in all the channels you publish on. Go find your audience, corral and rally them in. Do that mailing-list shit -- finally! You were so on top of all this at Scala, then just let it go when you stopped earning commission on the sales you created.

Now, it's time to break down and tackle this project. Do it the most modern way you WANT to do it, which is also easily transposable BACK to the Python 2.7 / vim / cron world that I live in back over on the Kubuntu virtual machine. So, do your 1, 2, 3 breakdown... 1? In my heart-of-hearts, I want to incrementally and cautiously advance the SEO Notebook project.
I ACTUALLY want to switch over to SEO Lab, because I feel the "big payoff" of a systematic input/output widget system that incorporates JavaScript-driven browser interactivity is just one alpha-software install away from achievability. Not that I'm going to do it, but when the time comes:

- https://github.com/jupyter/jupyterlab

Okay, I'm convinced I DO NOT really want to switch to Jupyter Labs right away. It just confuses matters. And almost everything I do in Jupyter Notebook will be applicable to Labs, so long as I plan my APIs correctly, so I can rewire I/O rather easily -- no prob. Get on with 1, 2, 3... 1?

Well, in this light, step #1 really must be figuring out how to "wedge in" special jobs into SEO Notebook. Tasks such as the one I'm about to perform cannot possibly be external to SEO Notebook. Sure, we're going to generate 15K+ rows worth of data, but I could just drop it into a CSV file. I'm essentially refining the Pipulate convention. Conceptually, what I'm doing is a sort of pre-processing job that creates the rows in the first place, for subsequent pipulation. The fact that the resulting data from the preprocessing might end up in a CSV file versus a Google Sheet is of no consequence. It should be a targetable thing. Hmmmm. I think we're talking about another sub-module of SEO Notebook. Just like the way there's a Pipulate and a GoodSheet module, I think there should be a... hmmm... let's see... I guess it must be...

What we're talking about is having data on the back-end of arbitrary size and shape, with which SEO Notebook and Pipulate functions can interact freely and easily -- very little overhead from external systems, and the easy ability to port data back and forth between Python 2.7 and 3.5 (for example) and make the data persist between executing instances of Python. In other words, we're talking about a Library of arbitrary Python objects... shall we call them Dictionaries? And so... I think we're talking about Dictionaries (generally speaking) in Shelves in Libraries. It's a DataLibrary. Inside the DataLibrary, you can find whatever help is necessary to grab-out and put-into and discover-the-nature-of whatever data objects you have deemed important for longer-term (longer than that one in-memory session of Python) storage. It's essentially merely USING all the pickling and shelving facility that's so conveniently and courteously built into Python. It will apply everywhere in SEO Notebook, providing an important missing (and long-danced-around) piece of the Pipulate puzzle. "Next to" Pipulate and SEONotebook and GoodSheet lives another module named... it's got to belong to this set; what will look good?

- seonotebook
- pipulate
- goodsheet
- shelvitall

Of course! Shelve It All! ShelvItAll. shelvitall. It's got a wonderful texture to it, and is dramatically different, yet somehow belongs with all the rest of the modules. Go ahead and make that friggin' file to sort of bank it and homestead in that noosphere!

I've noticed dip has stopped thumbs-downing my videos. He's either on vacation (from what?) or has recognized the futility of defining himself by how he stalks and thumbs-downs me, simultaneously trying to ape me on almost every -- thoroughly unpredictable (haha!) -- avenue. He didn't see Anaconda / Jupyter Notebook coming, I'm sure.
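
But back to the shelving idea for a second -- a minimal sketch of what shelvitall barely needs to be, since the standard library's shelve module already does the heavy lifting (the file name and keys here are hypothetical, not the actual module):

```python
# Minimal sketch of the shelvitall idea: persist arbitrary Python dicts
# between sessions using the standard library's shelve module.
# The file name and keys below are hypothetical examples.
import shelve

db = shelve.open("datalibrary")   # creates the backing file(s) if missing
db["keywords"] = {"blue widgets": 120, "cheap widgets": 45}  # store a dict
db.close()                        # flush everything to disk

db = shelve.open("datalibrary")   # a later session, even a later day
print(db["keywords"])             # the dict comes back as it was stored
db.close()
```

Whether the same shelf file opens cleanly under both 2.7 and 3.5 depends on the underlying dbm backend, so that part of the back-and-forth porting would need testing.
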
And I already have staked my claim on Jupyter Labs as the platform for product development that inherently teaches you programming and creating customizations, by the very highly documented and accessible and internals-exposed nature of the product -- insofar as internals are the content of the code you're running, and not the code that makes Jupyter Lab itself work. Important distinction. It's still just a glorified Python virtual machine rack with a really good web browser user interface. Or, should I go with shovitall? shelvitall? shovitall in a Google Verbatim search comes up with 19 unique pages, while shelvitall comes up with only 1, and it's an OCR text extraction: https://www.newspapers.com/newspage/288207/ of an advertisement from June 13, 1956. This is about as close to a made up word as I need to get for this purpose. shelvitall, it is! -------------------------------------------------------------------------------- ## Mon Aug 15 14:14:13 EDT 2016 ### Focus! I now need to zero in on this next round of report updates, systematically and diligently. The closest thing to this is what I did for the Menu project not long ago, when I went several times more than 5000 keywords deep in the Google Search Console results. I would like THAT capability just to be an easy-to-accomplish item with SEO Notebook. It is after all one of the most obviously necessary tasks for an SEO. It all starts out with a sort of data pull that exists separately from the greater report project. My Mac is running low on power I need for an upcoming meeting. So, commit and switch over to the PC for the rest of this project's documenting. Okay, I'm over on the PC now, even for this journaling stuff. I have a meeting coming up in about 5 minutes. Okay, it's over, and even been to the bank to deposit paycheck. But now... but now... how do I get this far along quickly? Go look at the actual report logic. It's going to be all about dipping in and out of this virtual machine. I want to be very careful, because report.py is created for Python 2.7. I wonder if I could get it running on 3.5... on Windows. That would get it into my main Jupyter Notebook... and THAT could be very useful with the reports. I could see what I could do with this report code when moved over to my current preferred environment. And that way, I wouldn't feel so shitty about a refactoring like this. Hmmm. Ugh! Yuck! Okay, Uncle! As usual... gotta go back into the virtual machine and just suck it up and do the work there, using Python 2.7. THIS is why so many people are stuck on 2.7. No matter how you slice it, it's work porting things to 3.x, and it's a whole round of little changes -- not merely a few tweaks to get the code running again. Okay, I'm stripping out all the Google Docs stuff from reports.py, and then I want to nearly exactly reproduce the logic that I did in the Menu project, but way up-front in report.py, and pop it all into some sort of global dictionary object that I can join to in later steps. I need to merely spin through each table once, adding a new column for each URL... is that right? Each URL is going to PROBABLY have multiple entries from the new dictionary I'm looking up. Hmmmm. What becomes the key in this dictionary? It MUST be URLs. But is that a key, really? Do I want to make multiple entries on a URL column, because that URL occurs multiple times in the Search Console dictionary? That MAY be the case. Ugh. Okay, this is NOT a complete refactoring of report.py. No, no, no! 
It's just adding some SERP data to existing URLs. Otherwise, I have to add a ton of rows... "expanding" the rows. Think! -------------------------------------------------------------------------------- ## Mon Aug 15 10:10:03 EDT 2016 ### Monday Morning Oh, it's Monday morning. I have to get my report to my boss, ASAP. Think through last week. One BIG thing to report. Various small things. Push forward harder. Things should be getting easier for you now. Make sure that they indeed are. "Bottle it". First order of business, I detected an anomaly in a reporting system, so I got out the necessary emails to get it on people's radars today. Now, hop to your weekly report! Okay, done. Only barely 11:00 AM... good. Remember, I don't have my Mac charger cable today, so be aware of your battery life on this machine. Commit and push often, so that it's no biggie when the battery goes dry, and then bring your charger in tomorrow. -------------------------------------------------------------------------------- ## Mon Aug 15 10:05:25 EDT 2016 ### Listing Some of My Life's Learnings It's hard to imagine that after such an entry as below, my commute actually can sustain that momentum, as I continue capturing such thoughts on both video (publishing as I type on YouTube) and in notes tapped into my phone, which auto-syncs to my desktop at work, which is where I'm typing into this distributed, revision-controlled textfile now. My typed-in notes from the subway ride are these: Continue your campaign of conducting each day like performance art. Keep pushing out videos that draw a picture of my worldview and passions. Don't pose or intentionally copy or ape anyone else's work. Do your own thing. You can happily riff off of somebody else's work, making a great example. Always try to be improving your own personal state of the art. Be proud of it. Your greatest work is built on the work of others. We are all deeply interrelated. Pride is not a sin; it's part of the type of loving yourself and loving life that helps. And self-love and pride in one's own work therefore is love in others, like it or not. So, loners should open themselves to the communities they backed into. Get over your inner self that rejects interaction, steering towards isolation. We are social animals and need that social interaction to not slowly go crazy. We'll always be the animals evolved to deal with the world at this human scale. Death means that a new generation will always take over and have their chance. -------------------------------------------------------------------------------- ## Mon Aug 15 08:05:48 EDT 2016 ### A Few Thoughts About Reality Human consciousness and free will are the icing on top of the physical laws of existence. We are probably a fuzzy fractal edge case -- a freak reality hack. As such incredibly improbable unit-blobs of matter, we are each being given a chance to enjoy, bellyache-about, ponder, or otherwise experience reality. It doesn't take many thought experiments to run up against concepts that challenge the imagination, like how big is the universe and how small can particles be. Not everyone goes down these thought-experiment pathways for long, because it rapidly becomes uninteresting and a lower priority than just staying alive. But for those who do not relent with these questions, some very small number of individuals just may make a discovery that changes everything forever for everyone in some actual meaningful way.
The accomplishments of people like Einstein, Planck, Feynman and others push forward the boundaries of "what is real" in a way that rewires our thinking about what reality actually is. Everything appears to be in some sort of mutual orbit around something else, with the more massive body generally being the center, and the speed of rotation increasing the closer the smaller body gets to the larger. And many things that we once thought of as particles also appear to have wave properties. Figuring it all out will have much to do with resolving this particle/wave duality in so many things. Reality itself appears to be some sort of all-inclusive system, of which we as individual human beings appear to be a part of. What I used to call existentialism, I'm coming to realize is actually solipsism. Solipsists can't easily be proven wrong, but probably are anyway. -------------------------------------------------------------------------------- ## Mon Aug 15 07:14:38 EDT 2016 ### Structure Your Days Better I basically ended up sleeping all my free time away, and watching Richard Feynman on YouTube. But then, I guess I needed the sleep. I'm still on a sort of recovery rebound, with the first-order healing having to do with sleeping and mental conditioning. But I have to start doing my physical environment conditioning. I have one check to write and deliver this morning, but besides that, I'm still in pretty good shape... today. But it won't stay that way. Get more on top of things. Structure your days better. -------------------------------------------------------------------------------- ## Sun Aug 14 11:14:30 EDT 2016 ### Richard Feynman -- Listen To Him More Often Just did another auto merge. Will sort out what should be where later. Today is about using today. It's already after 11:00 AM, but I needed that sort of lazy wake-up. Used to be my regular thing in my youth, and now get a morning like that every 2 or 3 years. I'm shortchanged on many things in life these days, but that's part of the exchange of becoming a father. But I don't want to be quite so shortchanged anymore. This is one of those days that can't be taken away from me now with surprise change of plans that consistently screw-over all my attempts to dig myself out of my backlog on the home-front. Well, I've got today for that now, if I can even keep on-track a little bit. 80/20 rule, and commitment and consistency. Go get some energy inside of you. Slow, easy burn with an initial pop. Pepsi, then ramen noodles. Sun Aug 14 12:23:56 EDT 2016 Puttering is okay, so long as there's positive forward movement in your life as a result of puttering. Puttering is like that deep breath at the beginning of a larger task. It's like taking stock... ugh! Billy keeps walking between monitor and keyboard. In all things are fundamental lessons of life. The generalized abstract rules of things are right at the surface, waiting to be gleaned by wrinkly monkey brains. Cats are evolutionarily speaking the enemy, and it's a tiny miracle that we have tiny cats as our loving and exasperating companions in life so frequently. I am definitely a cat person. I am definitely NOT a dog person. I do indeed prefer a smart asshole roommate over a mentally challenged patient in my care. Finally watching some Richard Feynman YouTube videos. I don't know why it took me so long to do that sort of searching in YouTube. Hmmmm. Lessons in that alone. That one of these minds was recent enough to have lots of video to watch on YouTube... wow. 
Would have been interesting to hear Nikola Tesla and others talk as well. -------------------------------------------------------------------------------- ## Sat Aug 13 14:28:47 EDT 2016 ### Word And Weird Are Both Weird Words Wow, back from two one-day weekends at the Catskills. Weird. Weird is the word of a generation, isn't it word? Word and weird are both weird words. Now, I'm not saying I'm providing the one true path into the future of tech literacy. But I am saying, I'm about to riff on Michael Kennedy's (of Talk Python To Me podcast fame) Python educational series. I'm going to go through all of them. I'm going to absorb and apply what I learn. And then, I'm going to figure out how to incorporate this into my daughter's homeschool education. I've already had the privilege of teaching Adi Dungeons & Dragons for a game probability math class she's signed up for in a few weeks. I skipped over a lot of background building and jumped head-first into win/loss battles with being taken captive or turned sides at stake. Next week: hitpoints! Woot! My daughter's learning Dungeons & Dragons, and it's something she HAS to do for her education. Oh, what a different life she's leading than I did with my idyllic suburban ticky tacky house experience. And now off to do exactly what I warned myself not to: Mr. Robot! Ugh! I will be productive when I recover. Got a close parking spot, but don't want to keep it there. Also need an oil change, and could use a wash. Be a responsible person who cares at least a little bit about getting a good return on the things you invest in, and putting on good appearances. Don't become TOO eccentric. Entertain your impulse to entertain with impulses. Design those impulses. They are sensory-input experiences of varying degrees of interactivity. Some are only pictures, or diagrams, or other things designed to have the story-telling worked-for in a decoding process requiring imagination, cultural and societal context, and most likely some reading and writing device like stone tablets, paper, or digital media. Publish. Make the stuff you publish have unique value, because nothing like this has ever been encoded and decoded before. Sprinkle the seeds of your brilliance out there in a multitude of those virtual communities that become the focal-point of human attention, be it through a computer, or in-real-life meetings, writing a book, or otherwise YouTubing, Tweeting, Tumbling, SnapChatting or whatever your way through life. Whatever you do anyway, sprinkle those first-dose-free bits, or like I'm doing, the raw, unedited idea-collection data stream layer. In other words, lifecast if you like. Expose "who you are" to the world a bit, and build some genuine audience and following based on that. I did one of the world's first Raspberry Pi unboxing videos. A silly, small thing, and it's one of the few videos that I actually ever edited, even a little, before publishing. I'm one-take-Mike on most occasions. If I don't like what I recorded, I don't publish it. In fact, I delete it, never to be seen again -- unless I was recharging with WiFi available long enough for Google to sync it all up. In those cases, I then have those. This is what I've been doing more-or-less since that fateful Raspberry Pi unboxing video -- vlogging. I barely even called myself a vlogger until recently.
And sometimes, I throw in what I'm doing professionally, and I even had one stretch of unemployment in there where I tried to do consulting, but fell too in love with the Python coding-up of a system that I needed in order to still have my "Secret Weapons" in this rapidly evolving world of ours. I don't want to become one of those obsolete and overly-seasoned people, too young (and without savings) to retire, and too old to land the really exciting job opportunities. Nope. Quite the contrary, I think the world needs me for my often contrarian views. I like to execute my code in little work-sessions, and rarely keep them running as a broad-use-case server for all future such queries. Nope, I set the stage for an investigation, I carry it out -- often transforming data one or two degrees away from raw, into some derivative indexy thing -- and keep track of the process so it's reproducible, analyze and notate the results, and forward it on to people who are in a position to take further action, based on my analysis.

- SEO
- Machine Learning
- Cross-platform, cross-language data ninja tricks
- A very Python programming language centric view of the world
- A grand tour of Jupyter Notebook and the move to Jupyter Labs
- Building full free and open source products under Jupyter Labs
- When Big Data and when small-but-good data, per goals and purposes
- vim, vim, vim, vim... emacs w/evil-mode... someday, maybe
- git
- Loosely all Linux, but mostly small non-gui IoT-style ones
- History. Lots and lots of boring compsci history.
- SciFi fanboyism and heckling Ray Kurzweil until I eat crow
- A life of intentional subjugation to the blub paradox.
- A loner's reluctance in actually engaging with the Python community
- A dad figuring out what of what I'm doing applies to my daughter's education
- A dad just trying to be a good dad.
- A guy just trying to get by.
- My own approach to things.
- Questioning, challenging, double-checking most important observations.
- A deliberate methodology around cherry-picking next experiences.
- A deep, personal internal life that I most easily share here.
- Thanks for joining me, hope to see you again soon, and don't forget to subscribe.

-------------------------------------------------------------------------------- ## Wed Aug 10 10:52:45 EDT 2016 ### Breadcrumb Project Complete Okay, this breadcrumb extraction is still taking longer than I think it should, and I'm worried that the http calls are still being made around the cache, and I want to add some sort of visible feedback. I don't want to rely on color coding... at least not until Jupyter Lab... Wed Aug 10 13:17:07 EDT 2016 Okay, I extracted all the paths. I had to refine the regex pattern a bit to accommodate the various patterns out there. I had to switch from Python's default regex trick of returning the first parenthesis group as your one-and-only match (not having to worry about matching groups) over to matching groups. I had to switch from findall to search, and I had to use a Python named group in order to explicitly choose the correct matching group. Ugh. But at least, I have a really good example now in the Pipulate functions of how to do this. I can see that all my old Pipulate functions from the Pipulate repo will be re-written (ported) to the new system, with the API adjusted to the new system, and improved for readability, code re-use and general elegance. But more immediately, I need to pivot into the second half of this project, which is making it a surf-able hierarchy. This is where I get to the 1, 2, 3... 1?
approach. Ugh. 1, 2, 3... 1? Done. Delivered, and BAM! Off to the Catskills. -------------------------------------------------------------------------------- ## Wed Aug 10 09:03:59 EDT 2016 ### BAM! Now Pipulate Functions use Python Decorators Wow, okay. Conquered Python decorators, finally. Wow, the improved beauty of the Pipulate functions is going to be very impressive. Can't wait to dig into these things now. I have something impressive to show, and the Jupyter Notebook embedded markdown is going to be where the context-sensitive documentation is going to reside. Greetings all! I figured out how to conquer the world's proper tech education deficiency, picking up where the Raspberry Pi half-heartedly left off. The Pi in Raspberry Pi is for Python, after all, and I think they allowed it to become something much bigger by not forcing it to be Python-exclusive. So, they changed the name -- at least, in spirit. Familiar story. Same as IPython Notebook. Now, it's Jupyter Notebook -- all in the name of removing the Python name. Aside from being objectively the best programming language to take up for a life-long love of coding (a decision even the #1 LISP Luddite Club (MIT) came to acknowledge when it refactored an already legendary intro-to-compsci course). Yup... Python won. But we can't call it that. Decorators are yet another way to back into the advantages of Object Oriented design. I HATE defining classes and methods and init methods and callables. Ugh! How could anyone actually WANT to think like that? No, no, I get it. It made possible huge advances everywhere, and is today inextricable from compsci reality. But the traditional OO APIs suck. Mikey likey file-based Python modules. Creating a filename.py is the same as creating a Class definition in more traditional OO languages. Every variable set inside modulex.py can be addressed as modulex.y, such as it were. This is the same as what we call properties or attributes in other languages and traditional OO interfaces. Python namespaces; what a wonderful way to back into OO design patterns. Python decorators are yet another way to back into OO design. It took me a good long while before I grokked them. I first really encountered them in the Python Flask web microframework. Flask is the rough equivalent of turning Python into a PHP-like template-based language. However, instead of writing individual files that map to individual pages, you write request-handlers. And to all appearances, request-handlers (or routes in Flask terminology) are decorated functions. The decorations in Flask look like:

    @app.route("/")
    def hello():
        return render_template('hello.html', key=val)

This is basically saying: Any request for the website's default page will result in showing the hello.html template sitting somewhere, fed whatever key/value pair dictionaries I want to make available during the rendering phase for some dynamic page components. Yet, the hello function itself knows nothing about rendering a page. It's magical. Not so magical. Inside the instance named app of the Flask Class that is created somewhere else in the app is a method named route. The route method takes parameters that stand for the path portion of http requests that are coming in. That route function was specially written to act as a wrapper to other simpler functions that only need to deal with doing a bit of stuff and handing operations onto the templating system. Huh?
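Before unpacking that, here's the whole decorator mechanism boiled down to a runnable toy that has nothing to do with Flask (the names here are mine, purely for illustration):

    def shout(func):
        # A decorator is just a function that takes a function and returns a new one.
        def wrapper():
            return func().upper() + '!'
        return wrapper

    @shout
    def hello():
        return 'hello world'

    # The @shout line is shorthand for: hello = shout(hello)
    print(hello())  # HELLO WORLD!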
Well, if this is tough to absorb, just consider that all languages have syntaxes that we take for granted after using them for a while. Not understanding their internals is not an impediment to using them. Case in point, I bet only a fraction of the people using Python Flask have any clue how this decorator magic enables the framework. Isn't it enough that it works and that it beautifies the code? Nope. Flask's routing is just saying this:

    def hello():
        return render_template('hello.html', key=val)

    app.route(path="/", func=hello)

See, it's that simple! I may have the attribute names wrong (didn't bother to look), but I'm hoping you get the point. The hello function is actually being fed INTO the route function (or method) of the app instance of the Flask class (or module). Decorators are just slightly cleaner shorthand notation. Fewer parentheses, basically. Mentally, the focus is on what "hello" is doing, so why not give it all our focus, and only just lightly "decorate" hello with the pesky detail of the fact we're really just defining a function intended to be fed into another function as an argument? No reason why not; this is Python! So my use case is exemplary for decorators. The url function below essentially intercepts a call to Title(**kwargs) that contains within it kwargs['response'], which is the entire response object from a request like:

    import requests

    response = requests.get('http://someurl')
    title_tag = Title(response=response)

Now normally, this would shift a lot of responsibility onto the Title function to "unpack" all the stuff needed to extract the title tag from the HTML text portion of the response (the stuff you see when you "view-source" in a browser). Alternatively, we could shift the responsibility to the portion of the parent code immediately before the Title call, but who knows if that's the sort of unbundling every extractor needs (it's not; title only represents simple text-node extraction (versus meta tag attributes, etc.)). Maybe you want to have a series of externalized code-extracting pre-processors, which you can easily pick from... using... drumroll, please... decorators! This example additionally externalizes the text node extraction, which is a step "down" or "inwardly nested" code extraction, more easily accomplished with traditional code breakout techniques. Only simplification in the "up" or "outwardly nested" direction requires decorators in Python. And so, we simplify up with the decorating url function and simplify down with the extract_text_node function, and look how simple the Title function itself becomes. Think how beautiful this is from a long-term code maintenance perspective, or for efficiently slamming out variations of code-extractors:

    import re

    def url(passed_in_func):
        """Decorator for functions like Title to pre-extract html text."""
        def requests_wrapper(**row_dict):
            html = row_dict['response'].text
            return passed_in_func(html=html)
        return requests_wrapper

    def extract_text_node(html, tag):
        """Returns text node string for tags like Title. Simplifies common scraping functions."""
        pattern = r'<{0}\s?>(.*?)</{0}\s?>'.format(tag.lower())
        compiled = re.compile(pattern=pattern, flags=re.DOTALL)
        matches = compiled.findall(string=html)
        if matches:
            text = matches[0].strip()
            return Response(ok=True, status_code='200', text=text)
        else:
            return Response(ok=True, status_code='200', text=None)

#### The Big Payoff of Using Python Decorators

Look how simple the Title tag extractor function itself (below) becomes.
It would be this simple too with JUST extract_text_node IF Title were only being fed the view-source HTML of the page itself, but it's not. Title is being called as Title(**kwargs), an arbitrary set of key/value pair inputs, and one of those key/value pairs is the entire already-fetched response object of a Requests webpage fetch. The decorator steps in to "strip away" all the excess crap on the invocation argument (a convention in the system I'm developing). So now, we get the best of all worlds. I don't have to "pollute" my simple extractor-functions with overhead due to conventions of the system. Decorators clean the input arguments first, so that all that reaches the Title function is the pure view-source HTML text snagged out and passed on by the decorator. BAM!

    @url
    def Title(html):
        return extract_text_node(html=html, tag="title")

-------------------------------------------------------------------------------- ## Tue Aug 9 19:37:20 EDT 2016 ### Facebook Video Chat is the casual telepresence winner Have to finish the breadcrumb project from home tonight. Also want/have to video-chat with Adi tonight over Facebook. Facebook's video-calling ability appears to be the one that's rising above Skype and Hangouts for me. Not sure what it is; maybe it's the most platform-independent one, or that so many people are so frequently actually running (or are always ready to run) Facebook. -------------------------------------------------------------------------------- ## Tue Aug 9 10:09:53 EDT 2016 ### Extracting Breadcrumb Trails From Web Crawl Data Still feeling yesterday's Malaise a bit. Analyzing traffic patterns across sites. Mentally factoring in: - Seasonality - Pokemon Go But even after that, I think I can see these Google "Phantom" algorithm recalibration tweaks. Okay, get the work done today that you have to get done. Keep a big picture in mind. Consider making a few of those videos, now that you're up to more Pipulate functions. I got my request-for-timeoff out through the HR system. Okay, now focus like a laser beam on this hierarchy breadcrumb trail thing. Tue Aug 9 13:55:54 EDT 2016 Okay, investigated Python decorators today. Even bought a new dedicated Kindle book on the topic. I'm going to try them, experience them, and let them sink in. But I'm getting the feeling they may NOT be the way to expedite or make more elegant and long-term maintainable today's work. Nope, simply not polluting the main repository in Github with day-to-day work is the real matter at hand. Tue Aug 9 14:51:58 EDT 2016 Implemented decorators on a Pipulate function. Woot! This makes the Pipulate functions much simpler, and less dependent on knowledge of the **row_dict structure. row_dict is now "unbundled" by the decorator function BEFORE reaching such functions as Title, which can now use a simple "html" as the input parameter argument. Tue Aug 9 18:12:20 EDT 2016 Okay, breadcrumb trails being extracted from crawl data wonderfully. Do the path-to-surfable-node-tree transformation from home. Take some pressure off of yourself.
Hmmm, I have to get to the dentist, desperately. My mouth, as Adi will herself happily repeat, is a mess. Yup, never really did the 2 cleanings per year you're supposed to. I'd be lucky if it were 2 per decade. Oh well. I have some getting back on track to do. Check those things you check in the morning to be sure nothing explodes in your face, and then get your weekly report out. Mon Aug 8 10:32:51 EDT 2016 Okay, the weekly report is out. Those feel good. This job is much less scatterbrained than agency life. I have a great project to do right now, breadcrumb trail extraction from the collected response objects, against a list of about 20K URLs that live in Google Sheets. Shelving the response objects for 20K URLs on this particular site took just over 3GB. Not a file I'm going to keep around forever. Interesting thinking of that sort of thing as a temporary cache now. Definitely liberating, but don't delete by accident. Hit this project home and get onto the next, and then to SERP Archiving! Woot! Mon Aug 8 11:53:21 EDT 2016 I'm making it so that if you try to Pipulate a large spreadsheet with nothing to Pipulate (an already completed job, for example), then it won't try to process any chunks. Mon Aug 8 16:57:39 EDT 2016 Gonna head out a bit early today. -------------------------------------------------------------------------------- ## Fri Aug 5 16:01:37 EDT 2016 ### Gonna Make SEO Notebook Fully Operational Okay, let me just slam out a long-ago requested deliverable that I keep putting off for some reason. But it's a union statement, basically. Fri Aug 5 17:16:21 EDT 2016 Okay, finished that project for my co-worker AND just shot the video that deals with the Google Developer Console and the authorization file with the Google client_id and client_secret. Wow, this will really let SEO Notebook just be downloaded and used the way I use it! Just get this video published, then scramble to get to Adi in the Catskills as early tonight as reasonably possible. -------------------------------------------------------------------------------- ## Fri Aug 5 15:04:25 EDT 2016 ### Our Privileged State of Matter Wow, okay. It's already 3:00 PM on Friday. I want to get out on the early side today. Let me dump some of my recent SimpleNote writing in here, starting with this morning's: Formalize your systems (Publish this) Systems are everywhere. Systems are everything. Even the reality we each experience every day is a system of matter-containment and automation, sensory input, processing, control and output. As collective human knowledge advances, so too does our understanding of our existence as information, chemical reactions and indeed, very advanced biological computers -- us. That's when it gets all meta and interesting. Industrial revolution to now. Now to 100 years from now? Wow. Tend your systems. The very Star Trek energy bubble beings that some of us aspire to evolve into are already us. Sure, maybe we have some evolving to do, or improvements to make over the basic human biological standard, but we're at almost the perfect scale and frame of reference to fully and properly enjoy being a part-and-property of this material universe. Woot! Let's party. And let's codify some optimistic human assumptions, like our beings have value, and all such beings have individual rights that balance each other out with the rights of others like you around you. Basic golden rule stuff. Fast forward.
We also, if we value our collective being and the world our children will be born into, have steward-of-the-planet issues to attend to. These are life's highest callings. We all, each, can make a difference. And some will make a superhero-scientist level of difference and will be recognized and celebrated for it, while others will do things for the sheer reward of only they themselves knowing. Others still will crave reward, and never earn it, right as those who never earned it are lavishly rewarded. Life ain't fair that way. But you can stack the odds in your favor. Have a positive outlook if you can. It will improve your quality of life. We hit the jackpot as far as matter in this universe goes. We are bright, and we are zipping around like particles on the surface of a foamy surf, controlling our own paths to some degree. -------------------------------------------------------------------------------- ## Fri Aug 5 07:53:15 EDT 2016 ### The Perpetual Newb Will be in Catskills this weekend, at the Grandparents' house. Foster what you're becoming, not what you are... funny at 45, imminently going on 46. Continue evaluating and knowing what's important in life. Don't miss any windows with Adi that you'll regret. Help shape her life while it's easy. Revisit Sun Tzu. That started fixing your mental chemistry. Thanks, Howard Diamond from Commodore. I think I may have discovered it later, but he introduced it to me well, and just in time to still make some difference, I think. Now finally listening to Ruby Rogues. Good podcasting, even though it's about Ruby. Don't start these series from the very first episodes until you've listened to a few new ones. Refine your elevator pitch even better: The Perpetual Newb: Forever Learning Linux, Python, vim & git -------------------------------------------------------------------------------- ## Thu Aug 4 23:14:03 EDT 2016 ### Mr. Robot is some heavy shit. Again, no content, but big enough to be worth mentioning. Shit, I am so the 1%. I should never let myself really wallow in or descend into depression. I got my shit together so well. Keep developing my mad skillz, but then be a fairly conformist good guy. We're all just going to die in ~100 years, anyway. Just try to leave the world a little better for our children, and the specific instances of children that we helped instantiate a little better prepared for that future. BAM! -------------------------------------------------------------------------------- ## Thu Aug 4 21:19:36 EDT 2016 ### There's a Paragraph For You I have to start using this journal at night. I remember back when I first used to write, how this would literally be the rudder steering my ship. And a shitty job it's done of it too, but in all fairness, my life went wrong the moment I was born the younger natural child to an adopted first. BAM! Wow, I didn't realize what a job that did on my head until I was in my forties. I always just considered myself an easy-natured, self-entertaining kid who loved my sciencey and art stuff. I don't think I ever stopped considering myself that, but there was other stuff. Distribution of resources kinda stuff, where someone with my potential hadn't particularly had my flames fanned. And only now, in my forties, am I really fanning the flames, and only then because my journals that I've kept, since I was about eighteen, have kept me from going totally bonkers crazy as my father went crazy and died, and I had to run a check cashing business, carrying out the will of my father. Wow.
He did a job on me with his death. A harsh awakening to the realities of life, while simultaneously and strangely insulating me from it... as did the strange charm of working for Commodore, whose Amiga computer I technologically fanboy loved. It was an amazing creation that I reveled in, and drew much of my identity from through those years. It's a good thing for me that it was very Unix-like, as Tony Antonuccio never tired of pointing out at the Philadelphia Amiga Users Group gatherings -- usually at Drexel University, which I ended up attending -- right during those years where they were the first to require every incoming student (not just compsci majors) to purchase a computer -- and not just any computer, but an early generation sodacan black & white Apple Macintosh. Wow, those were the days. The dream was being realized. And I had that gradual underdog mentality shift into a sense of superiority as Ed Flocco and Marc Rifkin drilled into me the superiority of the Amiga computer, and the amazing pairing of custom chips with a company with the capability of manufacturing them -- which just happened to be in our back yard, in both the form of Commodore Computers in West Chester, Pennsylvania, and MOS Technologies, makers of the 6502 chip that so many computers of that day were based on, in Norristown PA. We (Ed, Marc & I) grew up in the Philly 'burbs, right around the corner from all that stuff -- well within any of our driving distance. And so, I was gradually indoctrinated into some strange inner circles, like that of Scott Szczypiorski (a mutual friend), whose Dad was a chip designer at Commodore. I mean, imagine that -- one of the original guys who started out when etching a circuit board meant xacto-knifing it out of rubylith for photographic reduction into the IC plate. They lived through it becoming meta in the sense that the chips they designed created environments where they could begin doing this in software instead of with xacto-knives. These guys were the folks who booted the digital era originally, before machines that digitally booted were even common yet. Computers were clawing their way out of the proverbial primordial goo as the post-calculator higher profit-margin doohickey product. And today, we're on the verge of hacking reality. BAM! It's never too late to educate yourself. Today, my main things are Linux, Python, vim, git, SEO... and Machine Learning. Yeah, definitely machine learning. ML fodder. That's this. Apply my own ML Kung Fu against my own rambling. Deal with strange edge cases, like really long paragraphs. -------------------------------------------------------------------------------- ## Thu Aug 4 13:33:58 EDT 2016 ### Flashback to Magically Working AWS PostgreSQL Foreign Data Wrappers Wow, blast from the past! I had to get into my free-tier AWS cloud server and add a new foreign data wrapper. Document the process here for future reference! Ugh. Okay, get some of the keywords in for full text search purposes.

- foreign_data_wrappers
- foreign data wrappers
- fdw

    ssh -i "~/.ssh/blah.pem" username@ec2-stuff.amazonaws.com
    sudo -u postgres -i
    psql -d template1 -a -f /home/ubuntu/repo/commands.sql

-------------------------------------------------------------------------------- ## Thu Aug 4 10:49:20 EDT 2016 ### SEO Notebook Seeding First, slam out the lookup against a static list of URLs. This should be a super-easy thing. Work your way through your backlog of projects. Okay, tackle the list-of-URLs / likely-union problem.
ALSO get the crawl going for one of my other SEO counterparts here. Get the crawl going, pulling down the view-source HTML of a bunch of pages. Get the highest traffic pages from querying against the logfiles. Okay, I'm going to select the top 50K URLs that have the largest number of Google referrers... done. Okay, SEO Notebook should be thought of as having an SEO investigation power-tool side-by-side with whatever data you need to be doing SEO investigations against, but it doesn't HAVE TO start in an SEO Notebook-centric way. Instead, I'm going to start in a Pandas-centric way.

1. Clone the repo with a new name.
2. Dump your csv data files into that repo location.

I still have to make my requirements.txt file for SEO Notebook.

- pip install gspread
- pip install httplib2
- pip install google-api-python-client
- the myauth.py file (client_id and client_secret)

Okay, once all the requirements are met... LOTS OF DISTRACTIONS (good ones, but nonetheless... and then lunch) -------------------------------------------------------------------------------- ## Thu Aug 4 10:20:55 EDT 2016 ### Hitchhiker's Guide To Python & Considering Python Decorators I'm currently reading "Design Patterns", which is basically the Object Oriented bible, and a long-running blind-spot in my tech savvy. I've understood OO design ever since Øyvind Harboe explained it to me sometime in the 90s when I was working for Scala, and it was still mostly a Norwegian company. I still remember clearly his description of being able to bulldoze through a codebase, leading you into the future, instead of being imprisoned by the past. OO bought you massive refactoring abilities. You were writing more general things, and not so precise that you couldn't back out of and pivot on any of your decisions. And now, I see why I never took it up. Yep, I get it. Yep, it's a methodology and approach either for people programming professionally, writing large code-bases to be maintained by multiple people over many years, or for people writing packages that will be distributed and widely used as common libraries in the free and open source software community, where a lot of things need to inherit from a lot of other things, for customized instances of things. Hmmmm. That second category MIGHT describe my case... yet, I'm still not sure. I seem very much like a candidate for decorators, where Kenneth Reitz writes in The Hitchhiker's Guide To Python: <blockquote>This mechanism is useful for separating concerns and avoiding external un-related logic 'polluting' the core logic of the function or method. A good example of a piece of functionality that is better handled with decoration is memoization or caching: you want to store the results of an expensive function in a table and use them directly instead of recomputing them when they have already been computed. This is clearly not part of the function logic.</blockquote> This appears to be exactly my case. Sit and ponder decorators for a moment. I'm about to figure out how to template-tize what I've done with the Title tag in the Pipulate package in the SEO Notebook repository. I'm about to be in a really good position to compare the set of URLs generated by an x-depth crawl with the set of URLs generated by either a Google Analytics or a SQL query against logfile-equivalent data. I also URGENTLY need to be able to just do a bunch of generic lookups using the system against Google Analytics, Search Console and other data sources.
I foresee potentially having to do fairly large data-pulls against GA&SC to cache results locally to do rapid re-lookups against locally cached data. Isn't that what Pandas is for? Especially Pandas in Jupyter Notebook, where you get in-memory during-manually-progressive-sessions caching for free? Yup. Don't go reinventing the wheel here. Only shelve data that needs to survive between sessions. Hmmmm. This system is really taking shape. -------------------------------------------------------------------------------- ## Thu Aug 4 10:00:10 EDT 2016 ### My Github Projects: Decrufter, SEO Notebook, Pipulate & GoodSheet SEO Notebook is going to lead to Decrufter. The "stack" is going to look something like this: - Decrufter: a set of SEO Notebook jobs the help you clean sites for RankBrain - SEO Notebook: an implementation of the pipulate job-processing specification - Pipulate: a spreadsheet-based job-processing spec & adherent sample functions - GoodSheet: the package that lets SEO Notebook work with Google Spreadsheet Hold back on doing the push on these ideas quite yet. Homestead in the noosphere now, getting it all in Github and actually published and documented and accountable. But don't go beating this drum, even here at the office, until a few key things that still remain are taken care of. -------------------------------------------------------------------------------- ## Thu Aug 4 09:43:46 EDT 2016 ### Lost 2 Videos I Shot This Morning... And No Biggie! Shot 2 videos this morning... good ones. First on SEO and second on sort of a "series" book review of Iain M. Bank's The Culture series. But sometimes unfortunate things happen that make you not want to use a video, and so I deleted it. Nothing big. It's just that I produce so many of these things, that any thing I accidentally can't cover when I want to, I will surely cover again later. Lot of repetition of the important concepts in my published works. The 2nd video, I'm deleting the 2nd because it refers to the first. Life is sometimes like that... no biggie. Still easier and less time-consuming than actually trying to edit to salvage my video. Shoot so much video that no single one ever feels that important to you. This all fits into the system. Persistent idea-capture, and clustered-topic alignment to URLs and channels. Yup, next generation SEO, here I come. Build momentum. Keep momentum. Use the idea-capture process to practice writing well and speaking well. Develop methods of marking-up and querying and organizing your idea-capture mediums, without forcing the idea-capture medium itself to compromise or excessively alter itself. In-line semi-structured data is fine. The way I'm consistent about how I start my blog entries with 80 hyphens, a headline date and a headline topic is a good example. I can easily use those as an index and map it to fields in other database systems, or whatever. Point is it's easy to parse and easy to query. There are many approaches to doing so, and whereas in the past, I would be looking for ways to do it in XML/XSLT in order to keep it programming-language independent, truth is you're tying yourself excessively to the Java & XML-parsing world, and baffling constructs like apply-all-templates when you're doing that. Just living in the anything-goes-under-Python world is wayyyyy better. 
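For instance, here's a minimal sketch (the filename and variable names are just placeholders, not my real tooling) of slicing this journal with Python: split on the 80-hyphen separators, then let dateutil chew on the headline dates:

    from dateutil import parser as dateparser

    # Illustrative only: split the journal on its 80-hyphen separators and
    # pull out each entry's headline date, topic and body.
    with open('journal.md') as f:
        raw = f.read()

    entries = []
    for chunk in raw.split('-' * 80):
        lines = [line.strip() for line in chunk.strip().splitlines()]
        dates = [line[3:] for line in lines if line.startswith('## ')]
        topics = [line[4:] for line in lines if line.startswith('### ')]
        if dates:
            entries.append({
                'date': dateparser.parse(dates[0]),
                'topic': topics[0] if topics else '',
                'body': chunk.strip(),
            })

    print(len(entries), 'entries parsed')

From there it's one short step to a Pandas DataFrame, or to mapping entries onto fields in some other system.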
And so, I shall slice and dice this very journal every which way with Python -- right as I leave the original idea-capture layer here (this daily work journal) in-place, already auto-published through the perfectly competent github.io system, which itself provides markdown-to-html formatting, using the Jekyll markdown processor which I myself don't use, because I want "view-source" to reveal exactly the file that I typed, and how I produce and maintain it; none of the server-side pre-rendering that I used to live-and-die by, because static pages were better than dynamic ones, back in the day when Google actually had problems figuring it all out. Those days are over. Publish more and better over fewer URLs (targets), and ye shall prosper. Take this page, for example. Starting to see it? Wow, I should really flow back in the old content still held in the private Github repo. I want this journal to be inclusive. But only do that when I have proper time to clean the data, hahaha! Expressing yourself in public is such a hoot. Thank you, Robert Cialdini, for the concept of commitment & consistency. -------------------------------------------------------------------------------- ## Thu Aug 4 07:58:50 EDT 2016 ### Planning My New Website around My YouTube Videos Last time I did a major personal website update, not too long ago (I was at Flying Point Digital, at the time), it was quite literally to be responsive. I was on the mid-zero's SEO-friendly theme, Thesis, and Thesis was really outside the WordPress norm in later years, having taken development directions that did not transition well between the radical upgrades that were to come in WordPress in the years to come. And so, I had to bite the bullet and get off Thesis, finally, as the Responsive website design revolution based on reliable CSS standards, including media queries, finally started to take hold. And so, I held my nose and installed Keisus -- a could-have-been Divi. But it wasn't Divi, so I find myself on a responsive theme rapidly growing long in the tooth. So... so... it's time to look at my next WordPress theme. And it's all about my YouTube videos, and the integrations I can do with this very journal (yes, this long-text-version) now. WordPress has tons of hooks through its own internal mechanisms and through the plugin ecosystem. Surely, I can transition marked-up portions of this file into an RSS feed, or something like it to drive WordPress content. It will be a fun project. -------------------------------------------------------------------------------- ## Thu Aug 4 07:45:24 EDT 2016 ### Life's Plans & How Tiny Decisions Have Compounding Effects It's tough to live every day as if by some grand self-imposed design. The design itself will inevitably change over time as you grow and mature and meet some of life's failures and disappointments and harsher realities. Our young-person dreams don't always translate into adult realities with any degree of recognizability or resolution. Science and scientist were my ultimate goals and dreams, I think I can safely say from back in the day. But, just as with the girl from the movie Cloudy With a Chance of Meatballs which I just watched with Adi last weekend, I toned-down my inner science nerd due to suburban teasing -- even by people who I consider primary school friends-from-afar after all these years. They meant well, but being saddled with nicknames can have long-term ramifications due to how they affect the little decisions along the way. 
I didn't pursue becoming a scientist and have always sort of regretted it. But then, who's to say? I'm feeling like I'm doing some real good now jumping on the Python bandwagon, and making machine language literacy my later-in-life mission, partially because of my daughter, and partially because of making up for lost time in the best way I know how. And these residual feelings, and having been through so much (experience), become a sort of plan or algorithm for living life that you can sort of stay tuned into so as to be more honest with yourself through life as each of those small decisions pops up, which, when collectively compounded over a lifetime, makes all the difference. -------------------------------------------------------------------------------- ## Thu Aug 4 07:35:26 EDT 2016 ### It Takes Your Whole Life to Know Yourself, and By Then, It's Too Late Relationship FARTs - "first unabashedly rough times". You have to get through them to see if you're life-compatible with the other person. And not everything that would at first glance seem to be those rough times, is. Rather, one party or the other in the relationship generally has the greater internal fortitude to get through rough patches, and lends their strength to their partners. So, a few rough times "going the wrong direction" aren't really going to teach you anything, except how everything will always be fine when the less-strong one hits the ropes, and the partner is ready to tag-team in. You have to see what things are like when the stronger one hits the ropes, and the less-strong one needs to step in for emotional, physical or financial support. And even if that support is provided, if it leaves an excessively strong residue behind about the profound unfairness of it all, then the relationship is based on false foundations. These situations are actually the great test of fairness. Of course, life isn't fair. But that's what relationships are often very much about -- putting a little bit of fairness and reward into an otherwise very law-of-the-jungle reality. So, relationships need to get through their first real FART, and long-term commitment decisions probably shouldn't be made until after that, so they are not built on false foundations. Not doing so is what sowed the seeds of both of my last two long-term relationship instabilities. I am not quite the rock and pillar of stability I appear to be. I am an inconsistent artist needing constant inspiration in order to project that face of rock-solid stability. When the inspiration runs dry, so does the willingness to go to any length. End of story. Or the beginning? Punchline: it takes your whole life to know yourself, and by then, it's too late. -------------------------------------------------------------------------------- ## Wed Aug 3 10:00:14 EDT 2016 ### My Listen List of Podcasts & A Google Hypothesis New tremors in Google already discussed. Continuity over time, and especially during and through transitions. Do not lose grasp of what's important, and what's on your mind. Publish more from your SimpleNote notes. Just listened to Talk Python to Me #68, which was crossing the streams with the Podcast.__init__ guys Tobias Macey and Chris Patti, whom I'd repeatedly become aware of but couldn't get past the first (dry-sounding) episode, and to whom I'm now going to give another chance. Sooooo glad I did. I'll have to catch up on Podcast.__init__, but also I've got what sounds like a fabulous Podcast listening list backlog -- perfectly timed, now that I finally finished Iain M. Banks's Culture series.
## My Podcast Binge List

- Curious Minds
- Hidden Brain
- Data Skeptic
- Wait Wait
- TED Radio Hour (maybe)
- 99% Invisible
- Risky Business
- Rational Security
- Hardcore History
- The Ruby Rogues
- Spark (Canadian Broadcasting)
- Lo and Behold (Documentary)

I'm reading an article now http://backlinko.com/seo-checklist and see that the whole concept of site decrufting is slowly dawning on the SEO industry. I should probably move a little bit faster on my Decrufter project. It's basically an application of SEONotebook. Move forward on that path, because it's better than doing nothing. Shit, okay. That gives me a pretty good framework. Now listening to TPTM #69 (ha, the hash symbol IS an indexing symbol... never made the connection to how they're used in http in-page bookmarks... ha!). Just think of strings as a hash numbering system. And that's why it's called a hash. Numbering systems are just numeric hashes. BAM! Okay, okay, be sure to wrap that into your system here. I've started using hash symbols lightly, but there's the whole markdown thing going on here too, which gives the hash symbol a different meaning. And so... and so... explicitness and convention. That's what I'm doing with my #xtopic pattern... for now. The fact that I write so much... this will actually work for me... eventually. I don't let this drop, because I've been in the daily journal writing habit since I was 18 years old. I will be 46 tomorrow, and currently I'm about to hear "design patterns for blogging" with A. Jesse Davis. Yup, everything's going towards the "essays" model. Let me dump in my serendipitous commute writing: Daily Journals to Essays I'm thinking the same thoughts as a lot of people. Need to really make my mark soon. Start mining my stuff here to boil it down to... what? A series of one-page websites? Yeah, that may be it. Or maybe a book. Not sure, but ready, go, set! A Python Newb's Perspective Approach everything as an essay; not a book. This way, you will be able to encapsulate, identify, serialize and distribute units of processed and encoded thoughts. Paul Graham seeded or reinforced this notion in my head. The field of SEO told me something else: spew garbage, slicing and dicing it into interlinking pages, and ye will be rewarded for your high-tech publishing savvy. Autoaggregate every last damn thing and feed off the Google AdSense teats. So, I made HitTail. It could have been way worse. I could have popularized my RDBMS-to-XML-to-barebonesHTML-to-stylizedHTML website canon, which gave the option of outputting pages at the paragraph, page or section level via XSLT and XSL style sheets. I developed this trick at a prior employer, and it was awesomely effective in those days; static-ifying your dynamic content with daily refreshes where necessary. But instead of releasing a beautifully standards-based spam canon, I went with HitTail, and its presumption that what's nearly working for you today could be working a little better for you tomorrow, rinse, repeat. It's a future-proof approach to SEO. It's a decade old and still going strong. Still... just being present isn't enough -- you've got to put in some work too. To that, I say I'm already putting in the work that pays my paycheck. This is mostly for myself in the first place. It's this thing I just really do quick and easy. Production quality can always be algorithmically applied in the future. Today, it's about idea-capture into the accountably public domain, insomuch as Google's YouTube and Github repos can be that.
Google Hypothesis When you Google Hypothesis, you will find a few things. Obviously the definition, but also the Python programming language's 3rd party package called Hypothesis. This site will probably forever be equally about both, for old-school SEO continues transforming as a field, as it always has. It's like surfing from the ocean into a tube, and hopping onto a jet-ski as you do. The tools change. The presumptions change. The languages change. Words go away like the Google Dance, and others come to be like tremors and instability in the rankings. -------------------------------------------------------------------------------- ## Wed Aug 3 07:58:15 EDT 2016 ### That's Called Being Human Yesterday was Adi's homeschool graduation in Willobrook Park in Staten Island. Nice event. Went on Merry Go Round 10 times. Ugh! She won an award for most likely to make a microscopic zoo. Gotta follow up on the Tardigrade Circus idea. Perfect example of an idea whose time has come, but I'm letting slip by. Start building up all the layers now. It's greatly about building up foundational layers of abstraction. The most abstract key concept has already been laid down -- to the degree she won a homeschooling award for it! Wow, okay, let me think. Next steps? I'll think about next steps on my commute -- and naturally also how it fits into my mission, because yes, in the end I'm an existentialist, and the meaning of the subjective outer-world is entirely defined by how it relates to, and is in harmony with, the objective inner world. So, there! Yes, it is always about me... as damn well it should be, you hypocritical idiots. It's like on an airplane when they tell you emergency instructions, and they tell you to ensure your own face mask is securely fastened before tending to your child's, because if you're not taking care of yourself, you damn well can't take care of a child on top of that. Positive treatment of others is an additive process... additional to your positive feelings about yourself. Get balanced. Feel secure. Take deliberate level-seeking, game dynamics simulating risks. Give yourself breaks. Enjoy those breaks, and feel life's contrast between pushing yourself and rewarding yourself. However, don't feel like you have to live at the extremes of either end. Excessive pushing for too long isn't always good, nor is excessive rewarding for too long. Alternate. Follow the rhythms that your body seems to suggest, and you will eventually zero-in on optimal. It is very different for different people. What people think of as "hard work" is probably one of the most subjective units of measure of all... haha! I'm part of this world's privileged 1% and I know it. And still, I have a hard time. That's called being human. -------------------------------------------------------------------------------- ## Tue, Aug 2, 2016 8:33:48 AM ### Adi's graduation today. Took off from work. No content. Just wanted to document that. -------------------------------------------------------------------------------- ## Mon, Aug 1, 2016 9:30:47 PM ### Finally Watching Mr. Robot... Uh Oh! Finally merged the SP4's old few edits. Finally watching Mr. Robot. Ha ha, this is what people talk about. Tomorrow is Adi's homeschool graduation, and I took a vacation day. Wow, I don't think I'll watch a second episode... or maybe I will. Damn, not an addiction I need. Maybe I'll just re-listen-to The Hydrogen Sonata now that I finished it and don't need to pay that close of attention. I like ship named "Mistake Not..." wow, Mr. 
Robot is getting interesting. I have to turn it off now if I want any chance of actually getting to bed early. If I'm going to waste this time, it's going to be for bodily recovery, which will feed the mental resiliency I'm trying to build back up.

--------------------------------------------------------------------------------

## Wed, Jul 13, 2016 7:53:41 PM

### Careful About How Hard You Push Yourself

Wow, I had such a headache today. I think I have an entry going at the office, but I had to head out because I was dead in the water -- and right in the middle of an important project too. Well, it's not even 8:00 PM and I do have all night. I'll be calling Adi soon to say good night. I just noticed that there are different actual date formats here. Some are in military time, and some use the AM/PM notation. Interesting! Dateutil must have just seamlessly handled that.

--------------------------------------------------------------------------------

## Mon Aug 1 11:03:25 EDT 2016

### Kenneth Reitz having big impact on me

Got out my weekly report AND a round of refinement, per in-person discussion with boss. Good feeling, really like having a boss who knows what he wants and is good at setting priorities and keeping the work funneling through me at a sane rate. ALSO getting the chance to push forward the "latest system". I think I am as close to happy as I have ever been. Happiness comes from inside yourself, and from the feeling you get from doing certain things or being certain ways. Happiness does not necessarily come from outside you. It is not something that is done to you. Happiness is not the result of your state changing due to outside things. Happiness is a result of internal decisions you make, of which what's going on outside you is only one (albeit a fairly large) factor. But that's a big part of what being human is -- increasing the weighting of internal factors over external.

A very interesting thing is that over the weekend, I reached out to Kenneth Reitz (I can finally remember his name), the creator of the insanely popular Requests package for Python, asking him to do a RegEx for Humans package, and he referred me to the parse library in PyPI, which describes itself as the opposite of format(). So, I've got TWO things to learn. But if Kenneth says this is the way to go for his style of simplification over RegEx, then so there I shall go... at least to kick the tires and see what it's all about. Could be pivotal to the next evolution of the system. Go research format(): `format(format_string, *args, **kwargs)`

Okay, piddled most of the day away recalibrating and delving into Kenneth Reitz's suggestion, and subsequently discovered Speaker Deck presentations. Interesting guy! But knock off a to-do item. Okay, I need to start familiarizing myself with str.format(), which apparently is replacing the old %-formatting technique per https://docs.python.org/3/library/string.html which ties into using parse() over RegEx, as Kenneth suggested. I've got the learning that I need to experience documented, but for now you can just get to the work that you need to do today without further delay. Oh but wait, the basics are...

```python
'{0}, {1}, {2}'.format('a', 'b', 'c')
```

...outputting...

```python
'a, b, c'
```

Not too hard.

--------------------------------------------------------------------------------

## Mon Aug 1 09:32:06 EDT 2016

### ML Fodder, The Journey is The Reward, & The Long, Trudging Route

So glad I did that entry from home.
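Circling back to the parse-versus-format note in the Kenneth Reitz entry above, here is a minimal sketch of what "the opposite of format()" looks like in practice (assumes `pip install parse`; the template and the log line are made up for illustration):

```python
# parse() runs a str.format()-style template in reverse: the template that
# would have produced the string pulls the named pieces back out of it.
from parse import parse

template = "{page} got {clicks:d} clicks from {engine}"
line = "/python-advantages/ got 42 clicks from Google"

result = parse(template, line)
print(result["page"], result["clicks"], result["engine"])
# -> /python-advantages/ 42 Google
```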
The "great, steady mind" that I have often thought about over the years in my journals (since 1988) is only true if you co-mingle the thoughts and activities of for-work with for-self. The professional and the personal DO meet in certain fields and with certain outlooks on life. In those disciplines that require the cleverness of seeing unlikely connections and relationships between things, and the patterns that ***really do*** matter -- without getting distracted by the also pretty patterns that will also be there that do ***not*** matter towards your, ultimately subjective-anyway goals, the more effective you will be in that field. Correlation does not mean causation, but it's still good data. Test for completely random correlation versus consistent correlation. If random, discard. If consistent, design a test to peel away one more layer towards true causation, insomuch as something can actually have true cause in a universe where position and momentum can't both be precisely measured at the same time. It's all pretty subjective, but there's enough clues to agree upon some common boundaries of an objective existence. This, I have to believe as part of maintaining my sanity. Multiverses and time travel both bother me a bit. We're a forward-only read-only firehose connection using cursors. The Universe is one big generator expression, and you probably can't time-travel without re-starting the Universe, or some damn silly hack that doesn't really qualify as time-travel by the more imaginative minds -- more like view-only time tourism (which in itself would be highly cool and informative, but not paradoxical). And now onto capturing my writing from the subway this morning, which is equally out there. But that's part of what differentiates me. This unfiltered stream of conciousness is part of my brand. Relying on keeping things moving steadily and unfalteringly forward, without getting bogged down with "production quality" refinement. I'm producing merely by virtue of recording, and plenty of people have gone this journaling route over the years. Glad that recent hit Mars movie/book re-introduced people to the journaling format -- as if blogging wasn't enough. But blogging is too concerned with platform. I want to be more concerned with data-extraction, over the years. So, keeping it platform-independent as one giant text file, and all in one-place, and easily pulled-in and parsed by Python, and tons of additional private meta data in the background by virtue of the private Github repo, and an auto-publishing of a virtually untransformed copy of the git head via github.io that I hardly even need to think about, and my daily habits like this very writing all feeding into the system... the system, where I'm trying to trigger a positive feedback effect, for at least one of the primary goals of becoming noteworthy, and subsequently of revenue-generating potential notoriety. There! The cat's out of the bag. I am a self-promoter. This isn't just for documenting my thoughts as I go about working. Ahhhhh! I said it. I feel so much better now. But do remember the words "isn't just for" and "one of the primary goals", because there are other reasons I'm doing this for and other primary goals -- not the least of which is actually just staying relevant in this dynamically re-shaping field of SEO. Static thinkers (I'm talking to you, dip) need not apply. I have a weekly report to produce during the next hour. I also want to dump in my subway writing. Oh, and I need to go get that 3rd coffee. 
Yet Another Attempt to Boot a Book

We spend our time coding ourselves up. Social creatures who must be taught survival skills through culturally transmitted knowledge and behavior are a hoot to hack. We're being hacked all the time, in greed & ego-driven clashes between smart, prepared, and (most importantly of all) clever independently mobile interconnected transient, self-reproducing monkey-nodes. They even have monkey names like Bezos. There goes my chance to publish on Amazon. Nah, I'll try anyway. I love me my Kindle, and I'm sure my one-degree-removed buddy, Jeff, has a sense of humor. Okay anyway, back to coding up our kids and ourselves. Should be a blast. Without proper coding and the accompanying integration into the very culture that coded you, you wouldn't well be able to survive, would you? A child raised by wolves. Tarzan.

Few concepts in your head, a.k.a. ideas, are truly new. We, as a common-language-speaking animal, have been having ideas for a very long time. You have to get pretty abstract and derivative to find wholly new expressions, but it definitely does happen. The universe may be like a symphony sung with an 11-string instrument, of whose strings we can only hear 3. Become a mathematician or particle physicist if that's your thing. Be one of the great enablers of inventors, like Newton or Einstein, if you can. Not everyone can, nor, I'm sure in the grand design, should become Einsteins and Newtons, or we would all be running around trying to outsmart each other. For the whole thing to work that we know as our world, there needs to be all sorts of people, living all sorts of lives, doing all sorts of things. Not everybody can be a scientist and inventor and adventurer and storyteller and leader and doctor and athlete. Most of us are in the tier one extreme beneath that, consisting of lawyers and programmers and politicians and craftspeople and minor-league athletes and most people who call themselves doctors, but are not.

Doctors save lives and improve health and quality of living. That makes theirs a high calling, because there is nothing more important and precious and unreproducible than your favorite instances of particular lives. Doctors help instantiate new instances of life when they help deliver babies. But even midwives can do this, for childbirth is such a natural part of life that trained medical doctors need only be very close by in case something goes wrong. As such, the world will always need a high ratio of doctors-to-people, because people are always being born somewhere, and the demand for baby-delivering doctors, constant.

When you're born, you're an infant, not a baby. Society doesn't talk about it much, I think, because it highlights how weak and helpless each of us once was, and how totally and completely and unconditionally dependent on our parents for our survival we once were. Parents remember this ultimate power over another human being -- power they feel they should rightfully have, having brought you into this world as they did -- which they don't let go of lightly. It is out of the same love - Transitions - Horror & adaptability - Empathizing, but not too far - Some people are bad. Really as terrible as you could imagine.
- The vast majority of people are deathly dull, of no real consequence, and harmless - Extremes exist, and are always an option, and do have their rewards - Money, fame, big families, lots of interesting things to see and do, without limit of time or resources - Except, that we'll all someday die mostly identically, hopefully finally at a nice old age and happy. Oh yeah. Time. Resources. Happiness. And ultimately, whether you even leave any sort of legacy or not, for however long any of that stuff will really even be remembered, is just a component of your own happiness as a distinct, currently unreproducible instance of a life. Not even much order or reason to it -- just potluck genetics and circumstance, akin to a poker hand you're dealt by your parents, and a gradual taking control of the game for yourself, as you splinter off from your parents' place at the table. So yeah, it's all some sort of game clearly, but how can you question that the objective is to be a unique and noteworthy instance of both a piece in and player of a game? It is. Realizing this, point for me. But it's on a hidden scoreboard. No problem. Next goal clear. Get on some of the more visible independent (of your own instance) scoreboards. But if you go this way and once you do, be prepared for adversity and failures. There will be people wayyyy better than you in any activity or interest you take up. The more famous the scoreboard's activity, the more crazy insane dedication or natural predilection it will take, and the more frustrations and failures you will have to endure. Dr. Seuss says this all much better in The Places you'll go. Dr Seuss is an example of a storyteller so much better than me, by virtue of the fact he does it with illustration, one of the skills I admire most. However, this does not prevent me from trying. And through the very act of getting into the next-level game and trying, you will draw the attention of both admirers and detractors. Do I register the domain mlfodder dot something? It's not registered right now, and I'm documenting it here openly. If I have a meaningful troll, this could be an interesting experiment. If I'm aped so closely that that domain... a good idea... gets snatched up, then there's way more available ideas and namespace for me to plant a flag in, and that theoretical loser would be out another $10/year or whatever. Haha, no, don't think like that. It's not nice. Hear that, AI? I give an idea away for a test, even though it could be the fulcrum on which I break out, just as sort of a joke. Well, the joke is that I don't rely on one particular by-chance fulcrum. I AM the fulcrum. If it's meant to happen, it happens due to the gravity-well and potentialities and particle physics all make it a high potentiality -- and NOT because I hit some sort of anomalous coincidental jackpot. My success is the long, trudging route. -------------------------------------------------------------------------------- ## Mon Aug 1 08:05:42 EDT 2016 ### Be Educationally Ready for Each Stage of Adi's Development Just one quick entry before I leave for work. I got into this wonderful journaling habit, and am now finally realizing how special and unique the thing I'm doing really is. To hell with blogging platforms or Medium.com or any particular publishing platform as my main thing. It's a plain old text file, vim and git. Github.io is the first round of public publishing, because the repo is currently private, which I'm paying $7/mo for the right for. 
It was an amazing weekend with my daughter, Adi. We did Saturday at the New York Botanical Garden, seeing the stinky giant flower that blooms every 3 to 5 years, and then the Palisades Mall. On Sunday we did Coney Island, and used up a $100 "best value" card. We went on The Tickler. Her hair is green and pink, and she got bright green cotton candy, and we stomped around in the sand and challenged waves in the ocean before heading back. Wow, she's getting wired-up well. Keep up your part, diligently, and on-time for the right developments. Her interest in coding will be sincere soon. Be ready!

--------------------------------------------------------------------------------

## Sun Jul 31 08:30:27 EDT 2016

### Github README for SEO Notebook

You know, why not leave the git merge conflict messages in place? I can programmatically sort it all out, later. It's really just the reverse chronological perfectness that's at stake, and I can literally sort that out Pythonically. Woot! Oh, and I tweeted at Kenneth Reitz, the creator of the Requests package, to do a Regular Expressions for Humans, and he referred me to the parse package. Checking it out. I want to respond with a Thanks, but first, I need to write an intro to the SEO Notebook project.

Greetings Diehards,

Now you have a machine gun ho ho ho. If you know that quote and are in this biz, then hello old-timer. If not, then welcome millennial newb! I promise you boring coding-industry buzzword jockeying for an awesome future-talk which, if you survive the boredom, just may be synaptically rewiring. I'm the silver-tongued snake-oil salesman (I **wish** I could cite that quote) who made the still-going-strong HitTail SEO tool in 2006, and SEO Notebook is my latest stay-relevant-ware that I'm peddling for some audience and Github street cred. It's 10 years later, and I've done a number of interesting things working the biz, here in New York. I was even at the top of the Empire State Building as the primary SEO for SAP there for a few months. My SEO Kung Fu was good, but this was an industry in flux. Nobody should be waiting for RankBrain to gradually get smarter and stop rewarding our sites by 100 1% increments. We should be getting out ahead of Google, and change the cat-and-mouse game into a human-domesticating-the-cat game. We nextgen SEOs are raising Google like a child. So, let's get programming! But where to start?

Many of you Macolites are going to think Ruby, because it's what Rails is on, and ROR has enjoyed a certain degree of cool-kids notoriety due to Ruby being shipped with Macs, and Rails being the first joyful framework to catch the fancy of the hordes of ASP and PHP refugees in the early zeroes. Others will figure JavaScript is the one language to rule them all, thanks to a loosely-typed, event-driven concurrent design that runs quite nicely on the server as well, thanks to node.js. You would be the "full web stack" crowd -- a euphemism for "I also dabble in node". All legit. All valid. All inferior to learning Python through Jupyter Notebook. You Java-folks, there's nothing I can do for you. You are today's generation of BASIC'ally myopic static-typed Luddites that will never open your mind to a language that discourages getters and setters. Pythonic code can be brief and beautiful, and fully legitimate from a compsci standpoint, because of use-case appropriateness.
We do not need to use the define-everything-compulsively mentality required for something that you know will grow to become millions of lines of code interacted with by hundreds of developers, such as sees to be the assumption of a language that makes you public static void the most basic hello world's. Just use Python. It's a language by a very smart individual trying to make solving impressively complex tasks easy for YOU -- the newb learner. Python is anti-elitist in nature. So why not kick off a Github repo with dem's fightin' words? Sure, it's troll-bait and flame tinder. But this is Jupyter Notebook, damn it! The Scientists are now onboard with this Python thing -- no... they were already onboard. Now, they're just opening the door and saying "Hey, the water's nice and comfortable over here... hop on by!" with such innovative solutions to sharing code-execution-environments as Anaconda by Continuum.io, which I at first dismissed as just a bunch of pre-installs that non-Linux users had to suffer (you could always run a VM on your Mac/Windows, which I did). And then I realized how nice it was when your encapsulation layer was a web-based IDE. Wow! Throw in Github-friendliness and the ability to stream output to the user through the duration of sometimes long-running jobs... and dead-simple multiple Python VM kernel maintenance? Wow, out goes my Pipulate UI work, and in comes Jupyter Notebook... uh... SEO Notebook... and probably, SEO Lab, that that's a story for another time. So, this project exists basically to sharpen my SEO-blade again, as I switch to a new master, yet again. I am much more effective in Samurai mode, because while I COULD go off to work for myself, the only real difference between drawing a paycheck and working for myself is the amount of hustle and amount of paper-work you need to do -- principally, both go WAY UP! I'm a draw-a-paycheck sort of guy, but I'm not satisfied with static situations. Things must be dynamic! I must be able to express myself creatively, and to great effect. I feel the need to move mountains, and be recognized for carrying out monumental and communally useful tasks. Those are the accolades normally going to folks like Linus and Guido. More recently, I've been coming to appreciate those to whom you must connect a last name to be recognized by those outside some very geeky circles, and those folks would be the likes of Raymond Hettinger and Kenneth Reitz. And this? This humble little project is just the latest incarnation of my "system", which I always have under development and in active use in some form or another. HitTail was just one small extraction from one such system -- an Ajaxy, server-push, real-time analytics approach to it all before any of that was even a wistful gleam in the eye of today's JavaScript-this-that-and-the-other-thing library authors. I always see the big next thing, even if I never am the one rewarded for it. Today's big-next-thing is the sort of portability of easily-studied ready-to-run programming code that comes equipped with its own execution environment and dependency-satisfier built-in. It should be an environment that can actually SERVE AS the input/output widgets in a pinch, so you can fully develop something that could become much more run-on-a-server-like apps in the future. 
But contrary-wise, they need not HAVE TO in order to still be useful on a day-in/day-out basis to those who use such tools in more of a hands-on power-tool sort of way (instead of scheduled, batched, and invisibly run off in the distance somewhere). Jupyter Notebook is the ideal environment for SEO Swiss Army Knife-like utilities. Today, I live strongly centered in Linux, Python, vim and git. This is that one driving project that I'm constantly using professionally, every day as my secret weapon, molding it this way and that to suit every purpose and every little custom one-off project, and eventually scalable and productize-able extractions of every sort. The project has just been rebooted from one of my other projects, named Pipulate, which I will constantly be raiding, while I bring the best of that already-amazing project over to here, where followers WON'T have to set up servers or try to match my mad vim skills just to get working. Instead, I'll be adjusting my YouTube video series so that ANY video in the series is a strong jumping-on point for this particular bandwagon. Apologies to those who would prefer a README to be of a more concrete how-to nature. That, like so many things in this project, is coming soon.

--------------------------------------------------------------------------------

## Sat Jul 23 19:23:04 EDT 2016

### Getting it all together

Interesting weekend. Adi is really in the habit of coming to the Inwood apartment on the weekends lately, and the both of us are being super-lazy, hardly even wanting to go out. Slump? Bonding? Fear of changes coming down the road? Maybe any and all. Definitely feel the need to write. Definitely feel the need to remind myself to work in iterative passes towards greater organization. You are either a mess, or you've got it together. Right now, you are a little of both, but you have to get it all together -- ALL together. Have to actually put time to use at least half-well while at the apartment with Adi -- aside from merely being with Adi, which is great, but I'm losing ground against life on this current course. So, I need to make course adjustments.

--------------------------------------------------------------------------------

## Fri Jul 29 11:02:45 EDT 2016

### Blowing Chunks

It is a common fault that people who get good at one language inject the rules of that language into another. Wow, the way he deals with people jumping the gun asking questions about why the tenets of one language aren't always applicable to another... it's my rodeo! He promises to get to answering that question, but not right now... okay? Wow, Raymond Hettinger is awesome! Glad I'm listening to him before I go all object-oriented with Python. Just sayin. I really like my SEO Notebook project. It's awesome and amazing! Use the 1:00 PM Triweekly to show Marat something? Maybe. Hmmm. Let me see what I get done between now and then... maybe. It's time to batch process. What is the lightest touch? Move your work over to your main Windows work machine. I've been keeping it on my Macbook Air for video-making purposes. Do your work on your PC and then move it back to the Mac for some videos later today.

In Python, you don't have to use getters and setters, because of the @property decorator. This is a big, big win in our language. You can expose your attributes, and if there's a problem later, you can just add on a property. Now, you can design classes without putting getters and setters on them. And he's clapping because it's awesome. Dynamic languages, woot! Woot!
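To make that @property point concrete for future-me, a minimal sketch (my own toy example, not one from Raymond's talk): callers keep using plain attribute syntax, and validation gets bolted on later without breaking any of them.

```python
# Toy sketch of the @property win described above (not from the talk itself).
class Page:
    def __init__(self, title):
        self.title = title              # callers just use plain attribute syntax

    @property
    def title(self):
        return self._title

    @title.setter
    def title(self, value):
        # validation added later, without changing any calling code
        if not value:
            raise ValueError("a title tag should not be empty")
        self._title = value

page = Page("Python Programming Language Advantages")
page.title = "A Better Title"           # still reads like bare attribute access
```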
Now, he's talking about the Flyweight design pattern (Slots). Ha, ha, interesting! Important when you're scaling up to bajillions of instances. Do __slots__ last to get memory efficiency. I'll have to look back at that sometime, when I actually know enough to know what the heck it's about.

Fri Jul 29 17:00:39 EDT 2016

Wow, got chunked processing done. BAM! Shazam! And all that jazz. Good day.

--------------------------------------------------------------------------------

## Thu Jul 28 10:46:37 EDT 2016

### YAGNI and Raymond Hettinger

Wow, Raymond Hettinger is awesome! Found my new guru du jour. Try to inherit from object when creating new classes. I give myself that advice from Raymond, having not really created that many classes for myself... yet. But SEO Notebook is a new opportunity. Version 3 of such a system is where I start to actually introduce OO techniques. It's interesting that "self" is as arbitrary as **kwargs. It's also worth noting that the way classes work is effectively a module (at the file.py level). Wow, very creative. Definitions run as if they're in their own modules. Oh, and yagni... ya ain't gonna need it. And I HAVE TO stick this into a customer's hands. I have a dog & pony show to carry out. Make the script for it:

# SEO Notebook

## I Work in Github

Greetings, my ZD counterparts. I live and work and breathe Github. ALL my work of any non-throw-away nature goes in there... but in PRIVATE repos when work-related and proprietary.

- You should be on Github too... for many reasons.
- Once on Github, I can share you into private repos.
- You'll need to know some easy git commands to clone repos.

## My Public Github Projects:

If work-related, but NOT proprietary, I try to wrap a generalized and broadly-applicable version of my work into one of my public projects:

- Levinux: A Tiny Virtual Linux Server for Education
- Pipulate: A lightweight SEO product that runs on Levinux
- SEO Notebook: A Jupyter Notebook-based version of Pipulate

## It's All About SEO Notebook, Now

I've discovered my all-time favorite

--------------------------------------------------------------------------------

## Wed Jul 27 15:18:09 EDT 2016

### Good Day. Few Notes.

Wow, I got out a very competent Menu first thing this morning, and have been working through the parts of the new system that are necessary to basically get pipulating again, but with unbound new Jupyter Notebook power, and keeping it YouTube documented, as before.

Wed Jul 27 18:08:17 EDT 2016

Heading home. Title tag function in SEO Notebook working.

--------------------------------------------------------------------------------

## Wed Jul 27 09:39:50 EDT 2016

### Let's put some of that Design Pattern thinking to work now.

Today, I'll be making some videos. Long overdue to do some for the public, and some for my co-workers, who need all the private repo stuff I've been doing on my last (Menu) project explained. So much of my work is being given a list of URLs, and coming back with some other data point about those URLs. From there, there are basically 2 models of how data against that list (of URLs, keywords, etc.) is obtained. You either make requests URL-by-URL, or you package all the URLs up into a single request, send it, and get back a batch response that needs to be correlated back against the original list so that each item from the original list gets the per-URL data that was sent back as part of the batch reply.
The former method is a cinch, and is generally the default way Tiger / Pipulate / SEO Notebook all work -- except that they have to be able to gracefully, and without much effort on the user's part, ALSO be able to handle scenario 2. I need to think about scenario 2 today as I implement the scenario 1 solution today. Can I make both the same? Ahhh! Design patterns. -------------------------------------------------------------------------------- ## Tue Jul 26 10:21:42 EDT 2016 ### Commit, Push, Leave No videos this morning. Listening to The Hydrogen Sonata, I believe the last of Ian M. Banks' Culture series that I haven't consumed yet. Wow, I love his work. Too bad cancer got him. What will I do after it? More Asimov? Filling in, what Asimov self deprecatingly admitted was the mediocre stuff between the Robot series and Foundation, haha! Tue Jul 26 16:51:27 EDT 2016 Wow, what a day. Things are coming together so nicely. This just might be being at the right place at the right time, having the right interests and the right availability of time to really dig-in, professionally. Wow, I just might be righting my wobbling, and settling into a gathering, rising storm whose fury is under my sails and wings and any other metaphors I can mix in. Riding atop this force of nature called technology trends, planting my flag on some of the greatest crests, hopefully... is me. From that perch, my purpose is to identify challenges, and spit off semi-autonomous agents that I have created to start breaking down aspects of the problem in to manageable chunks for other, smaller less abstract problem-inspectors to look at too. It's all very imaginable under today's tech. It's not too SciFi. Raspberry Pi really was the shot off the port bow... do I have that idiom correct. Anyhoo, I predict it. I see these things. I never cash in, but I need to, and I need to be exactly that sharp in my industry, and willing to go all-in for short periods of time to establish myself as the genuine noosphere homesteader, at least as far as a semi-practical implementation goes. And that's where I am. Jupyter Notebook (and soon, Jupyter Labs) under Anaconda is very nice, but at some point, I'll try to re-spark an interest in Levinux, controlling the QEMU binaries like I never have before, and therefore, able to modernize it quite a bit... connecting less-brittle dots. Teaching people how to automate the building of servers, so that PaaS vendor-beholden API abstractions with no-source blobs inserted here and there to make it all possible being totally expunged from the system. That's how it was with Raspberry Pi and the graphics coprocessor bits. Embrace proprietary to help you be more effective, but do not use it as your entire foundation. That would just be silly in the era of the rise of free and open source software. RMS would disapprove of that being the chosen label, but hey, it's a label that sticks. Let's stick it. Tue Jul 26 18:05:27 EDT 2016 Heading out now (at a reasonable time), and will do some work at home tonight. It's so hard to get the motivation to do that sort of stuff, but gathering storm and all that. Gradual organization cutting across ALL aspects of my life is one of the signs of the gathering storm... and I have to MAKE them happen. To not be futile against the storms life throws against you is what it is to be human. So go from thought to implementation more rapidly and iteratively. Commit. Push. Leave. 
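Circling back to the per-URL vs. batch question from the Wednesday entry above, a rough sketch of one way to keep both models behind a single call, so notebook code never has to care which path ran (fetch_one and fetch_many are hypothetical stand-ins, not real Pipulate or SEO Notebook functions):

```python
# Hypothetical sketch: one interface whether lookups run URL-by-URL or batched.
def fetch_one(url):
    # stand-in for a single request returning one data point for one URL
    return {"url": url, "title": "..."}

def fetch_many(urls):
    # stand-in for one batched request returning {url: data_point}
    return {url: {"url": url, "title": "..."} for url in urls}

def lookup(urls, batch=False):
    if batch:
        by_url = fetch_many(urls)                     # one round trip
        return [by_url.get(url) for url in urls]      # re-correlate to input order
    return [fetch_one(url) for url in urls]           # the simple scenario-1 loop

rows = lookup(["http://mikelev.in/", "http://mikelev.in/ux/"], batch=True)
```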
--------------------------------------------------------------------------------

## Mon Jul 25 09:49:41 EDT 2016

### Flattening Hierarchy

All data is either hierarchical or tabular, and then all tabular data is actually hierarchical, because it exists on some digital storage media, which exists in some protective housing device, which is accessed by some reader which is connected to some interface which is in some larger device, which probably exists inside some computer or data center, which are all nested hierarchically on the surface of our planet... simple. Hierarchy wins in nature's preferred formats, and even at the molecular level, things seem to lump up into hierarchical bits. Sometimes you get crystal lattices (more like tabular) but once again, those are rare and ultimately lumps of stuff inside other stuff. So again, hierarchical shapes win in general in nature as a prevailing pattern. Okay, the menu stuff is still your job #1. The other stakeholder is back this week, and I want to / have to be ready to do the menu dog and pony show, without ever being caught off-guard. And there's now so much data stuffed into the menu program, it now makes sense to move it all out into a spreadsheet, for greater flexibility and awesome demonstration / control purposes.

Mon Jul 25 13:43:23 EDT 2016

I feel a bit of a refactoring coming on. The request is for it to just be one level of hierarchy in the menu. That is, one set of parent nodes and one set of child nodes. I suppose the whole original endeavor of figuring out what those original nodes and shapes were was worthwhile, mostly now just for "having process". We started from a legit corporate starting-point. And that sort of objectivity is nice. But it's not the real way here. The real way is in the great flattening, and the nodes are all sort of in-battle to be the selected top-nodes. And THAT is a very different picture than what we had before -- VERY different. My original approach was a good one, but seeing as how the hierarchy no longer fits, it makes one wonder whether the exercise was worthwhile in the first place. And the answer is, of course it is! Yet, there is some re-factoring to be done, which in the end will make all the difference. This is where mistaken blind scrambling to adapt rendered-irrelevant work to be relevant again often occurs. I guess I will be doing a version of that. But I will preserve story. I will insert and annotate, and make Jupyter Notebook my PowerPoint. This will be awesome.

Mon Jul 25 20:27:25 EDT 2016

Okay, heading out now. Have to call Adi and say goodnight before I head out so it's not too late by the time I get home. I have a very small set of things still to do from home, but I totally overhauled the menu logic to always produce only one level of menu expansion. It does scroll vertically a bit, and we can roll-up and consolidate. But I'm in very good shape for tomorrow. De-duping against URLs is probably the main thing, and I could even do that from home tonight if I am so motivated.

--------------------------------------------------------------------------------

## Fri Jul 22 14:15:24 EDT 2016

### Heading Out

Okay, now to actually get today's deliverables out before I run out of time. I had to decompress a bit... this week was so intense! Don't cop out on the final mile. I need to reorganize the hierarchy to be more shallow AND get in some more links from an explicitly provided list.
You don't even know what you don't know yet, and Adi will be here for a pickup at 6:00 -------------------------------------------------------------------------------- ## Fri Jul 22 09:39:11 EDT 2016 ### Jupyter Notebook about to become Jupyter Labs Thursday Cheese Steak and Friday Morning Bagels. Sometimes it feels like ZD is feeding me. Guess that's sort of like a step towards those high tech employers that feed you for free at lunch. I still remember my Commodore Cafeteria days. Wow, my exposure to that world was totally before the Dotcom explosion and bust. I am actually quite old-school -- though not quite as old-school as those engineers from the 60s and 70s who were there inventing it, who are still around. Met one from Bell Labs at Jane's birthday last Saturday. Fascinating listening to these people. Their very brains are future-proofed, for the very reason that they understand things at that low level. Jupyter Notebook is enabling reproducible science. Data Science... SEO Science? no, I'm an artist. I use data in my art, but I'm an artist seeking to be as objectively powerful in my data-motivated artistic creations as I can be. This is not to say that anyone is paying me to be a Data Artist. That's just how I see my (much maligned) field of SEO (or, "search engine optimization"). That is to say, I help companies get the pages from their published websites found in search -- principally, Google. Huh? More specifically, I get my client's media in front of people at the very instant they're making inquiries on that topic. Currently listening to Brian Granger over YouTube talking about Jupyter Notebook, and how there's probably close to 3 Million users, and how it's trending up in Github. Wow! JupyterLab alpha talk from SciPy 2016. Jupyer is poised to become much more than Notebook... in amazing ways. I need to look into the Terminal and Text Editor and alternative user interface capabilities in existing Jupyter. The existing Notebook will become the Classic Notebook, and Labs will become the primary user interface, so this is a big push forward. Jason Grout is giving live demos, but I basically got the point. Wow, it will even have fuzzy searching built-in. Friggin' cool... as if Notebook weren't a big enough step forward. Mention that in today's video. Okay, I have some work still to do today on the Menus, and I so want to do videos here. Maybe I'll crank out the one easy video, then think about the JupyterLab next steps. Built on top of https://phosphorjs.github.io/ and is about to reach 1.0. Chris Colbert created it. Pluggable, extensible platform. Primarily browser-based, it seems. Widgets and plug-ins that extend the environments in a number of different ways, including ways of rendering documents like markdown file-to-HTML rendering. Side-by-side editors with completely live-links, such as between the text editor with markdown and the HTML visualization of that markdown... very cool! Okay, Pipulate was born to live inside this environment. Fri Jul 22 11:27:39 EDT 2016 Okay, pushing out that video. Make sure to get IT uploaded to YouTube before your morning walk video, which is also awesome. Target Dictionary Comprehensions on this one... nice. Gotta get to my main work today -- drive this last bit of menu work home ASAP. Had to get this talking head video stuff out, while all that eval stuff was fresh on my mind. 
-------------------------------------------------------------------------------- ## Thu Jul 21 09:29:56 EDT 2016 ### Today Was a Good Day Wow, my work LAST NIGHT actually at home hit home so many important nuanced issues about the Menu project that I'm in a much stronger position today, and I do believe now it's all about hitting the iron while it's hot, to get a rocking-cool status of the menu for the meeting later today. It's all about shaping the selections! 1. Add a series of nodes into the correct locations (without hyperlinking them) If there's no URL data, don't try to construct the hyperlink. Done. Add nodes to the correct locations. Thu Jul 21 11:01:11 EDT 2016 Okay, wow. Implemented most of what I wanted, and am on the verge of an empty leaf node filter. Thu Jul 21 18:11:35 EDT 2016 Wow, what a day. Mission accomplished on so many fronts. -------------------------------------------------------------------------------- ## Wed Jul 20 11:37:45 EDT 2016 ### Super-focus on Menus, both Data and Process Okay, I made my goodsheet and serpchiver repos on Github private, and renamed the urlookup repo into SEONotebook. I also capitalized the L in the levinux repo and the P in the pipulate repo. The URLs that are given out do change, but the lower-case versions still work, and it looks much better on my profile page this way. I lost quite a bit of time getting my next generation Pipulate up and running, but it only took me a couple of days this time, instead of a couple of months. Experience, knowledge and diving deep into the details and nuances of Python can make a huge difference -- as does re-basing my work on Jupyter Notebook instead of generic Linux webservers running Python. That makes so much difference, leaning on front-end development work that much less. I also lost some time this morning creating that animated gif of the SEO pulse report graphs across the tracked properties to make visual the changes recently that have been going on in the search results, mostly due to algorithm tweaks (but also with some Pokemon Go and e3 stuff mixed in there). Anyway, I've got about an hour before a menu meeting with the boss later today, and I want to get things in better order. Shifting of gears -- urgently! You will be best served with some menu optimizations -- specifically, local caching so that you don't have to execute SQL queries over again for re-generating the menus. There's so many distractions all around me. Ugh! I need to focus, focus, focus! The biggest time-saving comes from caching the SQL, especially the 2nd SQL query, and so I'll use os.path.isfile to check for the pickled dataframe object from Pandas, and if it doesn't exist, pickle it into location. Okay, done. That's a HUGE speedup. -------------------------------------------------------------------------------- ## Wed Jul 20 09:23:44 EDT 2016 ### The Missing Piece is (has always been?) Jupyter Notebook for cutting the Catapult Ropes I'm getting a walking-to-work video out today that I'm very happy with, talking about my latest work including: - The evolution from Tiger to Pipulate to SEO Notebook - Clarifying the motivating forces behind each evolution - Describing how I always come back to using eval - Describing how I combined GoodSheet, Pipulate and SEO Notebook - Describing how I'm importing .ipynb files as if .py modules to do this And THAT is a good video. Wow, can't wait until I weave the embedded video code into this file in the correct locations, chronologically. Wow, big little steps. Winding the catapult... 
but FOR REAL, this time. The missing piece is (has always been?) Jupyter Notebook -------------------------------------------------------------------------------- ## Tue Jul 19 09:30:18 EDT 2016 ### Wow, Importing .ipynb files as modules... big move forward Wow! It seems like a week later. Had a Daddy/Daughter date night. Saw The Secret Lives of Pets at Union Square, and went to Barnes & Noble and got the Lego Arendelle Castle set that she's been asking for for awhile. I like to indulge her Lego propensities. Wow, another trend I totally nailed. Now, if I could only cash in on my precognitions. Sheesh! Okay, I'm sitting on top of a few. And I'm on, what, like iteration quintillion of my generalized system? Shhesh! Okay, now make it popular, your technique, your method. Pipulate is mostly an API, for what is an API but a set of conventions to which you abide so that when you put input into a system, you get output out of that system of the sort you were hoping for and predicted? THAT is the purpose of Pipulate, and I made a killer logo for it, gosh darnit! This thing lives. It lives now through Jupyter Notebook. Be sharp, and don't forget a thing! Do the work that needs to be done in the way it needs to be done, and don't let yourself get derailed by any of the old traps identified from the old patterns. YOU ARE HUMAN, and as a human, you are subject to all human foibles and contain potential for greatness as great as any human, or human-like life-form that has come before you. So, be at least just a little bit great, will you? Don't be a bozo. Don't screw this up like before, and before, and before, and before. Yikes! I'm not good with crisis situations, am I? No, I just survive. I don't tweak for an optimized future much, do I? Well, that changes now. This is me becoming that next level of self-aware, and taking control of the reins. Tue Jul 19 18:27:04 EDT 2016 Wow, just re-implemented Pipulate, essentially. Wow... Jupyter Notebook... wow, importing .ipynb files as modules, the way you would a .py file. Wow. -------------------------------------------------------------------------------- ## Mon Jul 18 09:57:17 EDT 2016 ### Re-Implementing Pipulate-like Functionality Jupyter Notebook Well, there goes another weekend. I'll be seeing Adi very early today -- leaving at 5:30 PM, and this is after getting in late today too. Oh, well I'll make up for it with an extra-productive concentrated day. But don't give yourself a headache again. You really are biting off more than you can chew. But you did that over 10 years ago, when you got married. Maybe you did it the first time you took a girlfriend. I am a boiling cauldron of potential, with only just the final few ingredients missing for me to become a powerful world-changing chemical reaction, of the Jobs or Bezos variety -- ACTUAL creative/original thinkers, and not just the clever algorithmically inspired clever thieves like Gates. It's better to be second and win than to be first and lose -- unless you can pull back into first position by repeating the trick that got you there in the first place -- over and over and over, such as Apple often does. And so, I'm the stew trying to look outside of itself for the ingredients that I need to complete myself. And that takes some stewing over... but at 45 years old, I'm running out of time fast. Is it REALLY just a matter of connecting the right dots correctly? No, you also need to keep yourself a little bit better organized. At least try to be that organized at work. 
Let it seep back out to your day-to-day life. Lead your home-life by your work-example. That way, you continue earning well, which is always your path to reduced-stress continuity. Don't go broke or lose your earning capacity! I have to pick up right where I was seamlessly. I actually left at a fairly good place on Friday, with processing lists of URLs... no... I didn't get to processing the URLs. I only got to pulling them. What about the update back? Can my list_of_list be updated back? Mon Jul 18 10:46:26 EDT 2016 Use your nickname strategy to always NICKNAME your current work. That could be a big work-flow breakthrough, really. For example, right now I'm wavering a little bit about how to organize my efforts around next steps. I got a list_of_lists out of a GSheet through GSpread, but it's not a cell_range, and so can't be updated back in batch, and that's bad. Select a cell_range in a fashion as similar to a list_of_lists as you can, and inspect it. In fact, inspect the list_of_list to assure yourself it's not actually a cell_range in disguise. I need more "Operation [This]" and "Operation [That's]" in my life. Mon Jul 18 14:16:25 EDT 2016 Okay... time to think about APIs. This is the chance to make BIG moves forward in terms of simplicity and such. Mon Jul 18 14:35:36 EDT 2016 I want to think through this next step as intelligently as I can, because I will have to live through the ramifications for a long time (again). Mon Jul 18 16:56:58 EDT 2016 Okay, time to commit and go meet Adi down in the lobby. I guess it's an okay stopping point. Had 2 last-second requests come in to send stuff out. Racing to do it now. Adi and Rachel waiting downstairs in lobby. Raining outside. -------------------------------------------------------------------------------- ## Fri Jul 15 10:48:14 EDT 2016 ### Code Cleanup Day Okay... today. Today is a gift. I have a few questions to answer. Do those quick and do those first. Leverage Jupyter Notebook where and when you can, making EVERYTHING FUN! I can't imagine I'm being paid to do this sort of stuff. So make the daily tedium tasks JUST AS FUN! Fri Jul 15 11:56:10 EDT 2016 One of the most important projects lurking about is to do lookups against an explicitly provided set of URLs. Let's do all the standard GA, SC and now even PostgreSQL lookups against these URLs, getting back all sorts of datapoints and statistics to inform our decisions. Don't let that be a capability blind-spot anymore. This is the most important thing you can do. And start to knit in a whole bunch of other things too, such as: - Retrieving the full-HTML "snapshot" to field-stuff store in spreadsheets (of smallish numbers of rows!) - Doing a target keyword (multi-word terms) analysis of those pages - Simple title-tag retrieval - Something about the link-graph - SEO difficulty scores, from such APIs as SEMRush (and KeywordFinder?) - Something against the Twitter API -- perhaps, the most re-tweeted hash-tags on related topics - Something Machine Learning (first baby-steps) Okay, don't get too ambitious right out of the gate. Let one thing build upon the next upon the next, and make each thing wonderfully Pythonic and a joy to talk about and demonstrate in its own right. Finish what you started with GoodSheet, which is exactly what you need to do in the first place. Should I rename it HappySheet? GoogSheet? The transition from GoodSheet --> GoogSheet... implies a simple and subtle, yet profoundly awesome animation! 
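Tying back to the list_of_lists vs. cell_range question from the Monday entry above, a rough gspread sketch of the batch write-back idea (the sheet name, the range, and the already-authorized `gc` client are assumptions for illustration, not my actual setup):

```python
# Hedged sketch: pull a rectangular range of Cell objects instead of bare
# values, mutate them, and push them all back in one update_cells() call.
import gspread

def uppercase_column_a(gc):
    ws = gc.open("GoodSheet").sheet1        # assumed spreadsheet name
    cells = ws.range("A1:A10")              # list of Cell objects, not a list_of_lists
    for cell in cells:
        cell.value = cell.value.upper()
    ws.update_cells(cells)                  # one batched write-back to the sheet
```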
Fri Jul 15 14:07:50 EDT 2016 Can't believe how fast the day is flying past. Awesome discussion with Marat. I like having a counterpart on the other side of the country, at IGN. I'll have a friend in San Francisco to visit one day with Adi. But now, I need to get to work and knock off a deliverable and a permanent capability expansion. -------------------------------------------------------------------------------- ## Thu Jul 14 08:15:55 EDT 2016 ### High Pressure Day, but Turned Out Okay Merging getting easier. Of course, I'm just coordinating my daily journal writing with myself, so it's not so challenging. The fact it's reverse chronological doesn't hurt either. Now to work. Uploading a video that I'm not sure I'm 100% happy with, but it is sufficiently vague enough I think to be fine. I am not a Data Scientist. I am a Data Artist, I believe. But an Applied Data Artist -- and not just for visualizations, either. It's when the shaped data has to get fed directly back into the system as input to help create a positive feedback loop. I think I forgot to put the work I did with Search Console while at home into the private menu repo, but that's okay, because I have the data file. Use all your safety nets, and don't lose focus of the prize. The prize has to be front-loaded with 80/20-Rule easy-picking's work this morning. No chasing rabbits! I did that enough for this project -- BUT for the better, I think. 1, 2, 3... 1? Oops, I take all that back. I just had to stash my local changes before a git pull. I have the new searchconsole.ipynb file -- which is awesome in its own right, and should be the subject of a video, BUT I need to get to work USING the data right away. My machine started overnight. Make sure the reports run... okay, done. Even now, faux pas avoidance. But drill on. Next? Oh yeah, SQL Workbench/J, which I incidentally got working on my desktop Mac at home, connecting to the corporate database with OS X's built-in VPN client capability -- woot! It was VERY helpful. Pure SQL is sometimes nice too (in addition to Jupyter Notebook and Python). Okay, I blended the hand-picked items from the striking distance report that I drew out of the "not provided" still provided keyword data, which gets around the Search Console pre-processing... nice. Will be noteworthy later, don't forget. Next? Extended the conditional color coding for VERY easy explanation. Now, I need to be able to pull in that data -- maybe as one giant data object in Jupyter Notebook using GSpread's dump-it to a list-of-lists trick. Yeah, let's do that RIGHT AWAY. No time to deliberate. Thu Jul 14 09:32:12 EDT 2016 Critical juncture time! Focus & do. Walk the OrderedDict -- something I'm already doing to get the node count. Inspect that bit of code. Report generating failed. Manually setting virtualenv and running from shell. Nice to notice not noticing Dropbox notices popping up on all my various machines from which I stay logged into that same Dropbox. Okay, pshwew, faux pax avoided, round 2! Now, for avoiding the biggest bozo faux pas of all. Scalpel cuts... Chisel strikes... Hand of the surgeon, hand of the artist. Go! Thu Jul 14 10:56:18 EDT 2016 Okay, I just did some housekeeping, like touching base with the boss, catching him up with progress and the plan, and sending out a meeting invite to the menu stakeholder, before she's out for over a week. This has to be an impressive "selling" job, and for that, it just has too look and feel and work so right. You have just one more bit of inspired break-through to do. 
I just created an OrderedDict nested-walk copier... woot! That's a huge piece out of the way. Now, I have to report on some sort of position as I go, so that I know when and where to insert in new nodes. Think! 1, 2, 3... 1? Identify leaf nodes, and put a self-targeting hash link, just to see a href's start to appear, and call special attention to leaf nodes... Okay, done. Brain buster, that one! Committed to Github, and re-executing from beginning. Take a deep breath... next? I could speed this up a lot not hitting against SQL every time I clear and re-run. Thu Jul 14 13:34:35 EDT 2016 Okay, I got caught in a rabbit hole at the worst possible time, and it's already 1:30. I really need to do this all in the next hour, in the way of making it all look and feel very real. Don't screw around. Time is up, and you have some major thought-shifting to do. You HAVE to make the hyperlinks on these things. Do that immediately! Thu Jul 14 14:35:35 EDT 2016 Wow, some of the dots are starting to connect! Less than a half-hour until the latest reasonable time I can show my boss BEFORE preview to stakeholder. Time is just about out. You have to make a few very special chisel strikes right now, so as to not look totally like a bozo. 1, 2, 3... 1? Nailed it. -------------------------------------------------------------------------------- ## Thu Jul 14 20:52:52 EDT 2016 ### Take That, You Nothing Better to Do-nicks Wow, I can't believe I didn't commit and push today's work journal entries that were actually from work. Well, tomorrow's merge will eventually put that here in correct chronological order. It's funny to me that some of you freaks will be reading this here first, before today's earlier writing is merged into this Github repo tomorrow, and thereby published, automatically by virtue of the amazing Github.io publishing system built into the already quite amazing Github system. Let me tell you folks, this is all connecting together into something special, and I'm here to wire up a few neurons through synapses or whatever singing doo wa diddy diddy dum diddy doo... I like to type, I like to rhyme, I like to do it all the time, singing a vi vim nvi emacs, only if you're genuine. You follow fad. You do it all the time. But I do to, but in I chime. The Raspberry Pi did it for me, just like the old Amiga did in it's day. I'm genuine, I think you'll find, quite second tier or maybe third, with even the great geniuses only riffing on other people's ideas, even if they do vastly improve the very thing they satire. This is why the great text editor wars of yor were so, and still continue to be to this day, one of the great genuine measures of your creed on this day. Both vi, particularly in its modernized forms such as vim, which come with syntax color-coding and other amenities many feel necessary in a modern environment, or emacs which never needs to be modernized, because it was born perfect an all LISPy, evil mode clinches it. emacs wins, if you can stand the Carpal Tunnel Syndrome. We in the church of vivivi feel it is best to not bother with such arguments, because that time is better spent being at one with your text file, navigating our way around that strange always-there, always really running things humble text-file, the source of all code. And is it really even a file? Hmmm. Think about what a computer operating system really is, and what its largest roles and responsibilities are. 
If graphics shells including the desktop and the browser and any local program user interface is supposed to be operating system independent, so that the code can run anywhere, then what friggin' role does an OS even have these days? If everything else is delegated out to highly abstracted interchange parallel-processing optimized components, at the end of the day the OS really only has to get the silly machine running in the first place, and handle all the messy local physical reality stuff of like reading information from disks and memory and writing it back into disks and memory. And why bother to blab about text wars? If time is so precious, then why are you spending your time reading this? And why am I spending my time writing this? I mean like, it's not a book or anything... is it? Or is it just performance art, because just as I like to talk, I like to type. We Enjoy Typing... WET. Some like it DRY. I say, some like it DRY. I say, some like it DRY. I like it WET. I like it as WET as you can let this all-WET jest get. 'Cause it jus' keeps getting WETter and WETter, 'round 'ere, cause WETer or not you get it, it gets WETer, I say... much, much WETter. Fine, now how was that? Pointless enough. But I warn you, before you criticize, you're going to have to admit that you're one of my readers -- especially if you comment on this one very particularly very very meta statement right here. You know, the one about meta statements. I think I've been watching too much Rick and Mortey, to tell you the truth. I like Rick Sanchez. I like the story-lines. I like that Adi is going to grow up knowing all these popular SciFi tropes from out of the gate. I am a geek. I am most certainly a nerdy, Python-loving, SEO-evolving, Awesome Daddy'ing due who doesn't shave and keeps my hair in a pony tail, because it's all the more time for coding and kitties... and a little bit of writing, that you're going to have to admit you actually read if you actually try taking pokes at me. Here's this guy... what wait, you read him... ? What, you mean you read him too? I mean, how could you? He just writes so much, and it's all so boooring! I mean especially like this part right here. Commit and push? Yeah, commit and push. -------------------------------------------------------------------------------- ## Wed Jul 13 08:59:41 EDT 2016 ### Today, Don't Screw Up! Okay, today? Don't play. Display competence. Be transparent. Be communicative. Don't let your thought process wander off. Fall in love with those aspects of today's work that must transmit that love-worthy feeling to others. And so... and so... 1, 2, 3... 1? Definitely getting the Menu object that I'm programatically creating to display as an actual expandable/collapsible tree user interface component... in Jupyter Notebook? Hmmm. That would be the ideal situation. Research! -------------------------------------------------------------------------------- ## Wed Jul 13 21:05:51 EDT 2016 ### Seat of my pants Expanding your reach. Exponential growth. The Kurzweil interview with Neil Degrasse Tyson. Wow, is my thinking in sync with his. Yup, we are dealing with the sub-functions of neurons... of course. We're ages away from a human brain equivalent computer walking around in a robot, but we're already there for simple tasks like keeping it balanced. So, insect and lower-form intelligence will come first, and machines will develop animal-like instincts and intuitions. The reptile brain won't be far behind, then emotions and all that spiritual mumbo jumbo. 
I think I'll like the reptilebots more.

Wed Jul 13 21:54:07 EDT 2016

I think my headache today was from pushing myself too hard on too many fronts, and feeling stressed from too many fronts. And with good reason. New job that I'm determined to make successful and long-term, coupled with wanting to be the most awesome dad for my daughter, even as the marriage falls apart. I saw a crazy person on the subway this morning, and distinctly thought to myself, that's what happens when someone can't hold it together, and how truly close we all could be to that state. It's just a matter of slipping a little bit... and slipping a little bit more... and more... and more. And so... and so... I have my time now, and I will not squander it. Nor will I sacrifice my health. I will strike the perfect balance and compromise.

Wed Jul 13 22:49:13 EDT 2016

I have to remove all slight resistance barriers that make distraction more appealing than focus. I guess I'm always focusing... I'm just focusing on the wrong things.

Thu Jul 14 06:43:45 EDT 2016

Heading to the office now. This is not a new journal entry, because I have not slept. Iterating closer, but this is not moving in leaps. Hail mary this AM. Design the rest in your head on the way in. Look at the code. Just take pics with your phone of your desktop screen. Don't try to code on the subway. Meditate and contemplate and hit the office with rapid-to-implement ideas ready to go.

--------------------------------------------------------------------------------

## Tue Jul 12 20:24:53 EDT 2016

### What Matter

I am still loving the Culture series. Currently listening to "Matter", which the online series listing says is not a Culture novel itself, but I think that's wrong. Set in the Culture world is set in the Culture world. I'm listening to it on Audible, which I paid a ridiculous $14 or something for when I bought the book-form for my Kindle. Salvation is in Statistics.

Tue Jul 12 20:36:05 EDT 2016

Just called Adi to say goodnight.

--------------------------------------------------------------------------------

## Tue Jul 12 11:33:42 EDT 2016

### Reading and Writing a Million Lines in Python

Wow, didn't get the journal going until 11:30 today. Looked like an all-clear on meetings, but actually just got invited in. Keep on top of the topic, but start researching how to go from tabular data to more menu-like nested data. I'm guessing Stack Overflow will be my friend. I like the pattern that's getting created with this new walking, Jupyter Notebook video rhythm. Each day, I could do:

- One walking-talk video that sets the stage for the coming tutorial
- One talking-head video that walks through a tutorial on that topic

This has a good "get it out of my system" dynamic, building up the start of the day as something I very much look forward to. I can use my time walking to the subway and even on the subway to plan the video I'll be shooting walking from the subway to work... and then hit the ground running when I get into the office. I love this approach. It fits into my natural rhythm, and always puts something exciting front-and-center in my mind. Today, it was generator expressions, which are indeed very exciting.
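
Since generator expressions are the topic of the day, here's a tiny, generic illustration (not from the work notebook) of why they're exciting: they evaluate lazily, so you can process a million values without ever materializing a million-element list.

```python
# A list comprehension builds the entire list in memory before summing...
total_eager = sum([n * n for n in range(1000000)])

# ...while a generator expression hands sum() one value at a time, lazily.
total_lazy = sum(n * n for n in range(1000000))

assert total_eager == total_lazy
print(total_lazy)
```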
I needed to make a file with a million lines:

```python
# Write a million lines in Python
with open('data.txt', 'w') as f:
    for i in range(1000000):
        f.write("%s\n" % i)

# Read a million lines in Python, technique #1
with open('data.txt') as file:
    for i in file:
        pass
print('done')

# Read a million lines in Python, technique #2
for line in (x for x in open('data.txt')):
    pass
print('done')
```

Okay, time to finally commit this journal entry.

Tue Jul 12 18:38:54 EDT 2016

Heading out for the day. Got the really tricky stuff done. Wow, this menu project could end up being a really big win.

--------------------------------------------------------------------------------

## Mon Jul 11 09:16:49 EDT 2016

### Have To Work More Effectively!

Back in the office again. Felt like a very short weekend with Adi. Gotta go see The Secret Life of Pets movie with Adi. The SEO Pulse Report CSV files are still generating flawlessly from my laptop in a headless virtual machine.

Mon Jul 11 10:05:44 EDT 2016

Okay, got my Weekly SEO report out. Feels good to get that out inside the first hour of Monday morning. Good process, and the best flow of regular updates I probably ever did for a boss. This may be the first boss I've ever had who had a really strong vision of what he is doing and is actually pushing ME forward, instead of just hoping I do the right things for the right reason because I'm so far ahead of the bosses (all prior employment experiences, with the possible exception of Acronym). And I like it. I may be even more effective being used as a directed tool, so long as I keep my intelligence firing on all cylinders. Don't let myself "go dumb" by following other people's orders. Solve the puzzle that exists before you at this moment. Take this privileged opportunity to create Pipulate 2 (3?) WITHOUT developing it. Hmmmm. Just live inside Jupyter Notebook, and solve that original API problem.

Mon Jul 11 17:23:57 EDT 2016

Day almost over. Ugh! Where did it go? And I'm meeting Adi in the city afterwards, so I'm going to have to jump out of here at 6:00 PM sharp, and I am not as far along on this Menu thing as I need to be. Shit, okay, don't get stressed. You have to knock this thing out of the ballpark.

--------------------------------------------------------------------------------

## Sat, Jul 9, 2016 9:14:16 PM

### Bonding With The Kid Over Ice Cream and Chowder

Enjoying Karamel Sutra Ben & Jerry's with Adi while watching the Thrice Cream episode of Chowder. Just installed Anaconda on my Surface Pro. Now to run it first from the icon they put on the start menu.

--------------------------------------------------------------------------------

## Sat, Jul 9, 2016 8:16:37 PM

### Homesteading in the Noosphere, I am... Hmmmm (in the voice of Yoda)

Getting close to her getting tired. Second meal. Same as the first, re-heated, and hoping she actually eats something. We woke up at I guess somewhere after 9:30 AM. Experimenting with fairly different "laptop" arrangements that the MSP4 suggests, like laid flat against your whole lap as if a laptop were opened well past its clamshell norm to lie flat on both backs. Adi's on my phone with her beloved Baby Panda series, this time Animated Stickers. I definitely want to do my Advanced Python article now. I have a lot of good stuff accumulating. I will be data-mining THIS page, essentially to "extract" articles and ideas and projects. It will also serve as proof when I have ideas first in that good ol' noosphere. Homesteading in the noosphere, I am...
hmmmm -------------------------------------------------------------------------------- ## Sat, Jul 9, 2016 1:33:57 PM ### Python, Robots, Kids & The Future Every time I go to type a word or two, Adi acts up. I have to keep in mind that this sort of activity, Adi deems as competition with her for my time. Little does she know I'm thinking about her constantly, even as I type about work and my job. My mission I think is to help kids grow up with Python. Step #1: Build up the mystique of doing coding... robots, magic spells. Let her know there's something special about this skill that has to do with basic reading, writing and math skills for the future. Maybe even more important, because it ties together reading, writing and math in a crescendo of machine automation. We humans need a little help. It's up to us to create our helpers, to help eradicate suffering and unhappiness. Thanks to robots and machines, everybody should be living like a heaven on earth -- even if that makes us deliberately keep systems in place to challenge ourselves enough to keep us at least that strong... maybe even stronger. Just as digital photography and video have improved our human memories, so will digital learning improve our reasoning abilities. We will be able to look at and question our world as if it were data to be queried. Enough data has accumulated up in enough ways, and been machine-analyzed in enough standard pre-processed ways that answering whatever likely questions come up will be quick and easy. The question language parser will see to one part of that challenge. It's local intelligence about how to interact with the global query language takes care of the next part. A back-and-forth ensues, a constant exchange of incrementally more information and alternate query drill-down possibilities. It's like command-line completion with the tab-key, but for all things in life. Life's IDE. We are all developers, basically developing ourselves and our children and perchance, our society and culture. Sat, Jul 9, 2016 3:41:43 PM Operating systems and graphical user interfaces are the wrong place to concentrate when it comes to important long-term skills. Making killer apps for a particular popular hardware platform is like competing for the big-bucks of pop-culture hits, and there's really nothing wrong with that. The world needs its Instagram and Angry Birds billionaires. But that's not the same thing as day-to-day obsolescence-resistant technical competence -- a much less sexy proposition, but a good and important starting point and baseline level of tech skill none-the-less. When you like to type (and like to think), entries like this from time to time throughout the day is no biggie. The trick is to feel comfortable on almost any hardware, anywhere and under almost any conditions. It's not just about one thing -- like Python. Even Python itself is so many different things. When I say Linux, Python, vim and git, I'm actually talking about so truly many things. Just talking about those... plus, occasionally JavaScript, RegEx and other supporting technologies... will be plenty of material to make an identity and a specialty and an income out of. -------------------------------------------------------------------------------- ## Sat, Jul 9, 2016 12:00:34 PM ### Become Invaluable Then Re-write The Rules Wow, it's been a long time since I've used my Microsoft Surface Pro 4 for journal entry. 
Here's my git pull into this directory:

```
Warning: Permanently added the RSA host key for IP address 'xxx.xx.xxx.xxx' to the list of known hosts.
remote: Counting objects: 151, done.
remote: Compressing objects: 100% (37/37), done.
remote: Total 151 (delta 61), reused 39 (delta 39), pack-reused 75
Receiving objects: 100% (151/151), 942.62 KiB | 171.00 KiB/s, done.
Resolving deltas: 100% (97/97), completed with 2 local objects.
From github.com:miklevin/miklevin.github.io
   1aae1b2..daad127  master     -> origin/master
Updating 1aae1b2..daad127
Fast-forward
 index.html         | 2249 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++-
 journalmagic.ipynb |   46 +-
 2 files changed, 2277 insertions(+), 18 deletions(-)
```

2277 lines inserted since my last time on this machine, and I see that indents are still set at 2-spaces (pre my PEP8 compliance days), so I need to fix that. Okay, fixed. Adi opened all but four of her new Yo Kai Watch series 2 mystery packs. She wants to stay inside today because of the black on her lips from the dentist. She's having two cavities filled. We have to be better about getting her to brush her back teeth. She's good raw material, but habits need to be formed through repetition. But if you lack buy-in that skipping the thing you're trying to institutionalize into your life (brushing, throwing out trash, etc.) is fundamentally wrong and objectionable, and you have a fight every time, then there are serious issues for long-term healthy adulthood. One simply needs to brush and get rid of one's own trash. Without doing so, you will always have problems, and be thrusting needless problems onto the people around you, causing constant stress. I need to use things like the Yo Kai metals as rewards for her doing stuff. At least I got her to throw out all the wrappers and packaging of the metals in order to get the last (but for the 4 I set aside for Pop Pop) pack. That worked well, as did working WITH Adi to wrap her beloved loose coins all over my bedroom floor into coin wrappers. Those coins are staying that way now. I need to think more and plan more. I'm still barely keeping my head above the surface. I need to make the time that I AM and that I DO spend on things more effective. And then I need to fill in the scraps of time here and there with real chisel-strike type of progress. Chip away, chip away, here and there, with whatever little bits of time you can cobble together. My work-schedule is intense: 9 to 6 with a 1-hour commute on either end, making it 8 to 7. That's 11 hours a day for an 8-hour job. Hmmm. Become invaluable and help rewrite the rules.

--------------------------------------------------------------------------------

## Fri Jul 8 14:36:18 EDT 2016

### Another Tutorial (still for co-workers)

Okay, getting later on Friday. My 2 earlier videos were really solid, and I feel the itch to push out one more on general list and dictionary issues. What I think I'll be doing is creating the dictionary I need to convert column numerical indexes to their Excel spreadsheet-style letters (columns Z, AA, AB, etc.). It's something I've been hardwiring for a while in the GSpread locations where I needed such an index, but I think I'll do it more eloquently for a tutorial.

--------------------------------------------------------------------------------

## Fri Jul 8 08:45:40 EDT 2016

### Rebooting Learn Python Endeavor for Co-Workers

Every day at work now is part of that journey. 1. Install Anaconda 2. Start Jupyter Notebook 3.
Don't have to learn vim Okay, I shot and published 2 videos today, by 10:00 AM. I produce more quality online content by 10:00 AM than most people do in a lifetime -- haha! I think it's truly forgoing the editing step that lets me keep up this pace. One of the two videos is great to help Marat and others get started with Anaconda, Jupyter Notebook and Python. This next phase of my online presence is all about accessibility, and the ability to carry out amazing feats you would have never previously thought yourself capable of. I get that feeling with Python almost every day now. I'm currently exporting the video from ScreenFlow (old, but sufficient version) to a file-format that I can upload easily to YouTube. I'm also trying to make a point to share my videos out there pretty regularly over my other channels, such as Twitter, LinkedIn, Facebook and Google Plus. Fri Jul 8 10:21:21 EDT 2016 Okay, I should really think through how to make today super-effective. How to minimize distractions, work on the right projects for the right reasons, and make good progress. Fri Jul 8 11:23:17 EDT 2016 Okay, I'm getting interested in the mitmproxy package on Python. It's for sitting on a proxy and inspecting the http communication, and optionally modifying it. It's the type of thing I need to do on my next step to start really understanding all the http traffic going through a site and doing some next generation SEO projects, where I automatically create 301 redirect maps keyed off of actual site traffic correlated with Search Console URL-to-keyword search mappings. But focus for now. Get the Marat request done. He gave you a list of URLs to look at and process for the bad bounce report, and he'll be into the office soon. It's a small list of URLs being pulled, compared to how many are being pulled in the greater report. Okay, I'm running the Bad Bounce Report on Windows. I had to add Google Analytics scope to the GoodSheet repo. Oops. Okay, and scipy had to be installed with conda on Windows. Interesting. I should take it out of the pip freeze requirements.txt and add it to the stuff to do with conda. -------------------------------------------------------------------------------- ## Thu Jul 7 21:46:29 EDT 2016 ### They're "virtual screens", not "virtual desktops" - Git will make you scratch your head and nod appreciatively. Pshwew! - vim will actually re-wire your brain, maybe for the better. Jury's still out. - Python will astound and amaze you with it's ability to let you astound and amaze other people who are not themselves actually onto the Python trick yet. - Linux will confuse and infuriate you, until you get into the zen of short, piped commands. There's a whole host of supporting characters here, because no matter what I like to think, these 4 tools (tool-sets) and the relationships and work-flows they imply and enable are not the be-all, end-all of tech tools. Even if you love them all 95% positive, such as I do, the workflow can still itself be improved, and the polish, stability, support, blah blah blah of other successful tools is never to be ignored. The old trick is to just write Python API-mappings around it, with some sort of wrapper. Sometimes the hooks are coded into both respective systems, and sometimes Python simulates a user in a Python-controlled shell terminal, with tools like Pexpect. Point being, one way or another, Python's going to be interacting with it -- no matter what the "it" is. So, some of Python's supporting cast are: - Unix-like OSes. 
Gotta love the reliability of host platform conventions. - Regular Expressions. The time always comes when you want a powerful finder. - JavaScript. Billions of JavaScript-capable Web browsing devices can't be wrong. Another sub-language like RegEx, but for UI implementations. - BusyBox Linux. No matter the device, pop this on and things are familiar. - vi(m) text editor. Learn it. Tolerate it. Let it grow on you. Mastery. - Jupyter Notebook. Look what happens when Scientists got a hold of Python. - Anaconda. We can't always be running our code on servers all the time. - pip/virtualenv. If it's a dedicated server, you'll only need pip. - Markdown. The directly readable, easily type-able style I'm using now. - OS X's (soon, MacOS's) elegant full-screen, 3-finger swoosh best thing to happen to productivity since the Amiga's virtual screen system. People have to learn to stop saying virtual desktops. Many of these things I put into full-screen mode, like the Unix Terminal window I'm using to type into vim right now is full-screen with no desktop wigetry to be seen. It's a pure, undecorated full-screen immersive, flow-conducive setting. I would have never expected it from Apple, but I think the highly restrictive mobile platform challenges have illuminated for everyone those things that are so horribly broken about the "desktop" paradigm of software user interfaces on the desktop. Point is, they suck. I never liked Workbench on the Amiga much. I was an AmigaDOS person myself, with my own distro called ARPnCrunch -- for the AmigaDOS Replacement Project plus the Crunch binary compressor that squeezed the last few megabytes of space on that 800K double-sided (but not high-density) floppy disks. Workbench sucked. Windows sucked. All Mac OSes sucked. Only full-screen Amiga programs (they'd be called Apps today) really rocked. And today's Macs are rivaling it. The thing made virtual is screens -- not desktops. ------------------------------------------------------------------------------- ## Thu Jul 7 21:23:20 EDT 2016 ### text editor wars still being a thing says something It's becoming an ever-increasing comfort to be typing into this journal. I don't care much for impressing everyone with brilliant, well-planned out articles, and refined and edited results. No, instead I'm mostly just blogging (and now, vlogging a lot too) for myself, to sort of help document and process my thoughts. It amuses me that this attracts to me one or two crazy into me fans at any given moment, and sometimes there's an emphasis on crazy. Some are crazy-awesome, like Paul, while others are just plain crazy, like he who shall not be acknowledged. Wanna know someone? Look at what they find fault with in others. This reveals precisely their own deepest insecurities in themselves, for it is what both occurs to them first and annoys them most. They think about it a lot, maybe suspecting that same thing about themselves, but are too scared to do the deep introspection to diagnose and fix the flaw. Instead, they find fault in others, and sometimes even elevate it to a sport. The more one complains, the sharper their image becomes in the mirror they summon. Eloquence? Nah. Tired, and feeling the stuff flow out of my fingers, because we enjoy typing. 
We feel vim is one of the greatest idea processing environments ever invented, and the fact that the text editor wars are still a thing after all these years is a testament to how battle-hardened, ultimately adaptable, and ultimately still relevant and applicable these text editing tools are across many a problem domain. Text editors like TextMate, Sublime Text, Notepad++, EditPlus, CygnusEd (CED), and the like all run their course and recede into the dark corners of shrinking user-bases and no reason to take them up anew. Dead platforms, abandoned code-bases -- EVEN IF contributed to the public domain, with no emotionally invested project leads. That leads to things like PFE (Programmer's File Editor), a competent editor that went as far as macros, but not as far as syntactic color-coding, and that enjoyed its era somewhere in the 90's. But against this backdrop, there was forever vi and emacs -- relevant skills on always-there, always-supported, quirky and idiosyncratic, but still amazingly useful tools.

I'm going to try to leave work early tomorrow so I can bring Adi to Coney Island to see the 9:30 PM fireworks, after her dentist appointment. I think she'll need a little cheering up. Ugh, I don't want to shuttle back-and-forth to Inwood just to get the car to pick her up and drive her to Coney Island. I'm going to drive my car to work tomorrow, and get cheap early-bird parking. Got some googling to do before I go to bed tonight, so I know what time I'll have to wake up tomorrow morning. At least we'll be spending the weekend here again. I feel like I made such progress last long holiday weekend, and that I can keep making progress with the weekend's fresh energy that's not getting used up at work and being all professional and Python today. Today, I made a YouTube video worth mentioning. Wow, is processing my journal going to be an interesting exercise to show off Jupyter Notebook with... wow! Pipulate 2 is going to just be Jupyter Notebook playing the UI-role of Google Spreadsheet, but with an enormous amount of additional power from that superb IDE-like Python code execution environment.

--------------------------------------------------------------------------------

## Thu Jul 7 14:49:52 EDT 2016

### conda virtualenv for a newb? Yeah, probably

Wow, so distracted today. Already almost 3:00 PM! Can hardly believe it. And I want to leave early tomorrow. And I have a project still to do for Marat... let me slam that out. We will also likely be taking up some Python tutorial stuff again. Today has actually been a most useful day, if you consider promoting Python awareness (and SEO-factor awareness / separate topic/email) as being useful. Yes, yes. I do. Momentum. Build success upon success upon success. Okay, I want Marat to be able to run the Bad Bounce report himself, but with the SQL modifications that are necessary for this particular request. So, run it on your own machine right now...

Thu Jul 7 16:31:39 EDT 2016

Okay, did a great clean-up job on The Bad Bounce Report. Now, extract it from the "core" repo, and give it its own identity. This is also a great opportunity to document the process EXACTLY for Marat by making a new conda virtual environment for it.
Actually, I'm documenting it directly in the Jupyter Notebook and using pip, but for documentation's sake, these are the non-conda dependencies:

```
pip install gspread
pip install httplib2
pip install google-api-python-client
pip install sklearn
(pip install scipy)
```

--------------------------------------------------------------------------------

## Thu Jul 7 09:32:12 EDT 2016

### Another Entry For The Books

I want an @t macro in vim to force me to look at and use my to-do list more. Whenever I want to RECORD a to-do item, I'd type @t and it'd set me up to start typing a new one. But not right now. Think through the experience you want to have with Marat this morning. Document in a 1, 2, 3 step procedure. Walk him through it.

1. I'm NOT going to show him conda virtual environments right away
2. The directory you're in when you start jupyter notebook becomes "home"
3. Files are easy to open when they're sitting in your home folder
4. Other folders sitting "next to" the one you started Jupyter Notebook from will be important too (they have to be sitting next to "home" for relative paths)

I'll assume Marat is on a Mac, and I'll assume Anaconda is installed. What is the BEST next experience to have? Hmmm. An introduction to Jupyter Notebook for the SEO?

# On Becoming More Technical, from SEO to Robots

## An Introduction to Jupyter Notebook for the SEO

### Why Coding? Why Python? Why Jupyter Notebook?

Coding happens. If you think you can get away from it, you're kidding yourself. So the main question is, if you're going to be forced into coding at some point in your life, you might as well get into the mindset to maybe love it -- because you might. I'd go so far as to say you should deliberately seek a path that yields enough up-front benefits to invigorate and energize you, but then just never stops delivering or lets you down.

### From Abstract Notions to Real Things

Of course, I rig this "why coding" argument to be biased towards Python... because I love it. Your effectiveness in many branches of today's and tomorrow's professional life will be determined by how well you can speak with machines, and after a quarter-century of experimenting with different ways of talking with machines, I finally found my way to one that feels natural and fits like an old, comfortable piece of clothing. Believe it or not, after a while, Pythonic thoughts will flow from your mind as naturally as speaking in a native spoken language, such as English. And this is when the advantages of Python REALLY start to kick in, because you'll find yourself able to go from barely tangible abstract notions floating around in your head, into actual working instances of that idea manifest in the real world.

### Programming Languages are Just More Software

Coding is just using a software product, just like any other software product, except that you have more granular control over what's going on. That makes it a little bit harder, but ultimately more advantageous. So, a programming language PLUS its corresponding execution environments constitute a software product, just like Microsoft Word or Adobe Photoshop. They are all creations for creating creations; systems for building systems, or what have you. Great creative works can be made in each.

### Don't Give Up on The Dip

Seth Godin pointed it out in his tiny little book, The Dip, in which he observes that, statistically, the vast majority of people give up on some tantalizing new endeavor the moment things get to be just a little bit more than comfortably difficult.
Once people are REALLY challenging themselves beyond the comfort-level, they bail. Don't. The lower the learning-curve for a tool and higher the initial-reward for using it, the more people will flock onto it, thereby lowering any "specialness" you once felt by being on the excellent, but now mainstreamed and undifferentiated tool. This is the cycle by which the cool and cutting-edge becomes the passé and boring old stuff our parents used. One of the main purposes almost genetically coded into the next generation is to change the game on the older generation, rendering them ineffectual and powerless. Of course, the older generation's game is tricks to horde and fortify long-won power, so the young upstarts can't pull the rug out from under them. As a result, Unix is difficult... just difficult enough to keep out the uninitiated, but so powerful as to completely disrupt the old-school alternative world that was figuring out how to timeshare-meter-and-monetize every processing cycle of every computer in the world, via Multics. We are lucky Ken Thompson invented Unix. Oh, pooey! Fortunately for us, there is a whole spectrum of software-difficulty and learning-curve steepness. We don't have to take the most difficult path, nor the steepest learning curve in order to yield fabulous and differentiated reward. No, in fact just about every provider of any sort of service out there that humans can use directly through Web (or other) user interfaces are scrambling (or have long-ago already done so) to provide programmatic user interfaces to their product. But it's not "users" as such that's using these interfaces, it's machines. But the term that stuck is "application". Therefore, these programming interfaces for applications are known as application programming interfaces, or APIs. And every API-provider at very least makes their services accessible via the same communication system that powers the Web called TCP/IP. In other words, some sort of "request" is made over a TCP/IP network to the service provider and some sort of "response" is sent back. The Web is built on top of that with a subsequent protocol called http. Not all request/responses use http, but a lot do, and it's often used for Web-accessible APIs. But you're doing your programming on your local machine, or perhaps a virtual computer in the cloud such as Amazon or Rackspace that you log into, and NOT on the actual computers of the service providers. So you need a "local" location to run your code, and in this fact resides most of the persistent and invisible obstacles to people getting into coding and staying into coding for the long-haul. There's probably a thousand little reasons an individual's endeavors to "get more technical" break down at this step, and most of this book will be dedicated to helping you, the journeyer, overcome these obstacles -- perchance, to fall in love with some of the less-known, but transformative tech involved in leaping over these hurdles. Without being rushed, how would I do this all with Marat? Hmmm. Okay. 1, 2, 3... 1? - SQL Workbench/J? Nah. More like: Let's pick up where we left off. First it's going to be a crash course in Python. Here's the few key things you need to know. I'm going to walk you through some fundamentals first, so that there is no mystery. It's still going to be full of all sorts of mysterious and wonderful things, but we're going to purge out the mysteries that matter, and put what's going on here more under your deliberate control. 
Open a terminal and type:

```
conda create -n py35 python=3.5 jupyter
cd ~/
mkdir badbounce
cd badbounce
source activate py35
```

### Crash Course in Python

- Here's a list
- Here's stepping through a list with a for-loop
- Here's a dict
- Here's stepping through a dict with a for-loop
- Whitespace matters... Surprise!
- For index, item in enumerate(somelist)
- For key, value in somedict.items()

### Opening Files to Read & Write

- with open('filein.txt', 'r') as acursor:
- acursor.read() vs. acursor.readline()
- with open('fileout.txt', 'w') as acursor:
- acursor.write('blah')
- No need for "close" and why... Pythonic thinking

(A minimal sketch expanding this outline appears a couple of entries further down, just after the Menu-work entry.)

--------------------------------------------------------------------------------

## Thu Jul 7 09:17:30 EDT 2016

### Going to get coffee

Wow, git pull is now engrained as a morning ritual -- and one of the first things I feel compelled to do when I get into the office. It goes something like:

- Sit down, open Macbook Air, which starts more instantly than my laptop
- 3-finger swoosh to my full-screen vim daily journal instance
- Keystrokes: Esc :w Enter :sh Enter "git pull"... merge or whatever
- Look to see how my Windows login is coming along
- Open Dropbox to check the Date Modified on the daily report CSV files
- Admire how smoothly that process is still going, running off a VM on my laptop
- Make a few morning notes (this)
- Glance at email, calendar and my Boogie Board (what's most important/urgent?)
- Lacking anything urgent, go get coffee

Oh, before I get coffee: I had to restart my Windows laptop yesterday, because of McAfee endpoint software, and so I can document my getting-settled-in process on that machine a bit.

- VirtuaWin on Windows 7 gives me a pretty good virtual screen environment
- Switch to screen 5 and start VirtualBox
- Start my Kubuntu virtual machine in headless mode on VirtualBox. Close the interface.
- Go back to screen 1 and start Outlook 2010. Wait for it to appear before switching to screen 2, or it gets wonky.
- Switch to screen 2, start Google Chrome.
- Switch to screen 3 and open a Windows CMD (command) window.
- cd into my cygwin home directory and into whatever repo I'm currently working on there
- activate journal and start jupyter notebook / watch for the Home tab to appear in Chrome
- Switch to screen 4 and open a cygwin MinTTY shell. cd into the same repo.
- git pull
- Go to screen 5 and open SQL Workbench/J

And one final note before coffee. I have an 11:00 AM meeting (8:00 AM PST!) with Marat to go over using Jupyter Notebook for data pulls! Ugh, okay. Let me think. Get that coffee to be able to think. Then, go through the Mac dependencies for the types of Amazon Redshift and Google Analytics & Search Console data pulls that I'm doing so often these days.

--------------------------------------------------------------------------------

## Wed Jul 6 13:23:53 EDT 2016

### Menu-work... The database edition

It's time to do round 2 now on the menu work, in the method in which you started it today. Okay, dependency time again on my laptop.

Wed Jul 6 14:55:10 EDT 2016

Ugh! I managed to install psycopg2 on my Windows 7 64-bit machine with a .exe installer, but then I had to manually copy the files over to the virtual environment. Yuck! But it works. My Windows 7 machine is now one of my primary Jupyter Notebook development machines. I'm glad Marat is on Mac. On Windows, even Anaconda is no assurance that you won't be facing dependency hell. Amazing.
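
Expanding the "Crash Course in Python" and "Opening Files to Read & Write" outline from the Thu Jul 7 entry above -- a minimal sketch of those basics, with made-up sample data (this isn't the actual tutorial notebook):

```python
# Lists and dicts, and stepping through them with for-loops.
somelist = ['apple', 'banana', 'cherry']
somedict = {'apple': 1.25, 'banana': 0.50}

for index, item in enumerate(somelist):
    print(index, item)            # 0 apple, 1 banana, 2 cherry

for key, value in somedict.items():
    print(key, value)             # apple 1.25, banana 0.5

# Opening files to read & write. No need for "close": the "with" block
# closes the file for you when it ends -- Pythonic thinking.
with open('fileout.txt', 'w') as acursor:
    acursor.write('blah\n')

with open('fileout.txt', 'r') as acursor:
    print(acursor.read())         # read() grabs everything; readline() grabs one line
```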
-------------------------------------------------------------------------------- ## Wed Jul 6 06:58:31 EDT 2016 ### Getting Ready for 11:30 AM Meeting Okay... not even 7:00 AM. Purify your mind and focus on the work. Oh! Do my actual joins. 1, 2, 3... 1? Lots of false starts on that one, lately, hahaha! Okay. Step 1 is to get menus.ipynb open, and to copy all the CSVs I generated on Thursday into there. Get these joins done. Get the Bad Bounce Report loaded in Jupyter Notebook along with menus.ipynb. Okay. Now, conceptualize the joins you need to perform. Working on Lenovo laptop... Pandas not installed. conda install pandas Okay, and now actually turn this directory into a git repo... okay, good. Dare I put this in Github? Okay yes, this is key, really. Done. Wow, that makes it feel real. That's a big commitment. Next? Develop menus.ipynb. Load every CSV file. Wed Jul 6 07:37:21 EDT 2016 Wow, even just doing the data joins is a chore -- mostly because it's only my second time doing this sort of thing in Pandas. Load ALL the files, not just the millions-of-lines one. Okay, done. Amazing what that does for one's sense of accomplishment, alone. Set the column names for all your CSV files. Okay, I've got them all loaded, and I've got all the column names set. Try loading the one with a million rows as just a single object without chunking. Okay, that's still blazingly fast, even without chunking. Delete the chunking logic for now, in order to maintain focus. Move it somewhere else. Wed Jul 6 08:30:18 EDT 2016 Wow, the SEO Pulse Report CSVs just seamlessly generated from VirtualBox running my Kubuntu image (with slightly less memory assigned) headlessly. Big win! Buys a lot of time for Tech team to tackle AND keeps me from getting knocked out of the zone screen-switching. It's critical that you document the SQL statements that were used to create all these CSV files. Done. Doing the joins, and it's pretty clear I'm not going to automagically generate a hierarchy out of this, and time's running out. The meeting with Cynthia is at 11:30 AM, and you want to be the master of this meeting, not a squirming didn't get it done guy. Okay, trying to do some joins natively in SQL. I've got some tricky tables to join. Ugh! I may have to limit these somehow. Okay, lesson being learned is similar to the last time I had inner joins not coming up with a big enough number -- watch how you do the original data pull if joins are going to be done off-database, such as in Python Pandas. Why not just keep it in SQL for as long as you can, if your querying layer happens to be an industrial cloud database, like Amazon Redshift. Just gritting your teeth and waiting for a query to finish executing where you can tweak the query and re-execute is way better than pulling what you THINK is good source data-pulls off of the system and then coming up empty handed. I couldn't even imagine making these attempts if it were not for my Scala SQL Server experience. I really learned SQL there. I need to switch from SELECT TOP 100 type syntax to LIMIT 100, which is more PostgreSQLy. My current query is up to 5 minutes. Interesting. I hope this works. It worked with 2 of the tables at about 3 minutes, and I just added a 3rd table. I hope this doesn't time out, because if this works, I think I have the data that I need. A lot of my work this morning on loading CSV through Pandas will have been wasted, but I'll be standing on top of much better data for the project ahead of me. I see messages as it executes about loading rows. 
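
(As an aside, the Pandas side of this morning's work -- loading each CSV with explicit column names and joining off-database -- looks roughly like the sketch below; the file and column names here are made up, not the real report's.)

```python
import pandas as pd

# Load each CSV with explicit column names (assuming the files have no header rows).
urls = pd.read_csv('urls.csv', names=['url_id', 'url'])
categories = pd.read_csv('categories.csv', names=['url_id', 'category'])

# An inner join only keeps url_ids present in BOTH frames -- which is how an
# off-database join can come up smaller than expected if the original pulls
# didn't cover the same rows. That's the lesson about watching the source pull.
joined = pd.merge(urls, categories, on='url_id', how='inner')
print(len(joined))
```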
Interesting, it's only up to 3060 rows after nearly 7 minutes of execution. I think I need to let this just continue to execute. I limited it to 250K rows, and it's currently reporting 4960 rows loaded so far. This could take awhile. I am beginning to think that there are a lot of URLs not mapped into categories. There are probably much more efficient ways to churn on this. I think I'm going to have to stop this query. Get ready to do a 1.5 hour hail mary play to be ready for your meeting. I need a big shift in thinking. What you give your stakeholder today is enough for her to continue designing, and for ME to go back and really make sense of this data in my head. Crap, okay, 1, 2, 3... new project... 1? What I'm dealing with in the supernav is more purely hierarchal issues than the stuff being promoted on the lively nav. And so focus on THAT table. -------------------------------------------------------------------------------- ## Wed Jul 6 04:14:18 EDT 2016 ### Get Into The Office Looks like I have a merge in my future. I know I did journaling from work yesterday. But this probably sums it up nicely, anyway. It was a free/found day, because my stakeholder for my next project wasn't in the office, but still I did not finish the work. And it's all still before me now -- but it's for the best. I'm taking an approach that I would not have otherwise taken. Now, I just have to do it. It's perfect focus-time and my only distraction is me -- and Billy, my cat, who demanded his cuddle-time. Very therapeutic. I can see what Billy cuddle-hour was all about. Billy is very sensitive to the need for things. We all draw energy from somewhere, and it is very possible to draw energy from Billy. Okay, enough of that. Time to think about the work ahead. I am in some fairly new territory, which is the reason for some of my difficulty getting off the ground with this project. It has now taken on that feeling of a college project that I procrastinated on and have to do overnight. I lack the muscle memory habits for this particular project, but I have to form them. This is homesteading in the noosphere. It's love-worthy for that reason alone. Data-informed navigational menu crafting. 1, 2, 3... 1? Do a conda update conda, from before you do a source activate journal. Updates include going from Python 3.5.1 to 3.5.2. Interesting. A lot of eyes are on Anaconda right now, I think. Wed Jul 6 05:23:50 EDT 2016 Ugh. Best strategy now is to go into work. Continuity. Enforced focus. Get the train ride out of the way ASAP. Go!:w -------------------------------------------------------------------------------- ## Tue Jul 5 09:03:45 EDT 2016 ### A Day of Distractions... Ugh! Still amazing how smoothly the transition from weekend/home writing to work writing occurs. Continuity of thought is wonderful. Now, if I only had continuity of work. Don't beat yourself up over it. This 4-day holiday weekend was all about the power of putter, in which I am an increasing believer. I remember puttering throughout my life, and how... ring! Call from Rachel. Okay, not bad. A request to get out of work early. I'm still conditioned from before we were separated. Do the necessary report-checks. The CSV files generated properly (woot!) and now look at the reports themselves. And be prepared for showing Marat how to run the Bad Bounce Report for himself, and showing how to extract platform findings. Ugh! It takes so long to go across the tabs in the pulse report. But it really does give you a nice overview of the sites. 
I should say something interesting about each property every day. I'm sort of on the verge of having a secondary journal again. I always want a secondary (and even tertiary) journal all the time, but never came up with a good and continuous wand private enough way to do it. Ugh, I have to move faster. The best thing I can do is this menu work. Okay, I have to tackle this problem correctly, and I don't want to take the wrong rabbit-hole chasing approach. I didn't do any of the work that I wanted (needed?) to over the weekend, but it was a July 4th holiday weekend with Adi, and I don't need to feel guilty about that at all. Don't think less about yourself because of the pressure being put on you. Plow through this. Hmmm. Take a look at the files you saved out on Thursday that sort of stage you for a rabbit-chase, but don't actually chase it. Take inventory! Look at your potential approaches. Use the 80/20 rule and buy yourself a lot of time. Fire off an email to the boss about being late on the weekly report this morning. Nope, get him a weekly report, but quickly. Done. Tue Jul 5 09:56:22 EDT 2016 And onto the main event. 1, 2, 3... 1: LOOK AT YOUR THU FILES! Ugh, see if you can't get VirtualBox to run headlessly. I keep getting thrown out of the zone by switching virtual screens (VirtuaWin on Windows 7). Okay, I am starting the virtual machine headlessly. Ugh! Get in with the default Remote Desktop app from Windows 7. Okay, Kubuntu is running headlessly, and I'm going to leave it like that for awhile without trying to RDP into it. Tue Jul 5 11:03:50 EDT 2016 I promised something by noon. That leaves me an hour. So, work backwards, from what it should look like and go BACKWARDS to the data... oh wow, just checked on my stakeholder, and it looks like she's not in today. Do this the right way... the puttering way. Let the data suggest itself. But still, work backwards! Tue Jul 5 15:11:55 EDT 2016 Distractions, as usual. This time, it was adding platforms to the bad bounce report. Not too difficult, but everything has overhead. Ugh! Have a new meeting scheduled for tomorrow. Don't let today's crisis slip into tomorrow. Focus! 1, 2, 3... 1? Make this so friggin' cool, you WANT to work on it all night. But don't MAKE YOURSELF. Dive into this, not like it's a to-do item hanging over your head, but instead like the most love-worthy work you've ever encountered in your career. I mean, IMAGINE what I'm being asked to do! Hmmm... okay... make SOMETHING JavaScript appear in Jupyter Notebook... and THAT's the cool place to begin. Work backwards from cool. -------------------------------------------------------------------------------- ## Tue Jul 5 00:38:35 EDT 2016 ### No Edits! Even Here Holding down a full-time job AND taking care of everything on the homefront, even given as much easier as it's become lately, is still a tall order. I'm exhausted after I get home from work everyday -- just like Dad used to be. I don't really want to do anything but sit still, and be glad I got through another day. The last thing I want to do is tackle laundry and start getting the place in order to be presentable to be sold. And with this 4-day weekend, which Adi spent with me at the apartment for the most part, I started making a dent. Even with Adi's intense full-on routine, I was able to get scraps of time here and there to clean and putter and organize. I even got Adi into some of it. 
There were tense moments as she wanted to spread the coins all over my bedroom floor again, and I can feel the stress welling up even as I describe it, but I was able to transition that into an exercise of using those coin rolls -- the same rolls those coins were all once in and emptied out of, much to my chagrin. Now that they're in rolls again, they're going to stay that way. But all-in-all, this was another astoundingly successful weekend with Adi. She skipped the July 4th fireworks, which were probably terrible because it started raining at just about 9:30 PM, in order to dig deeper into her new Nintendo Yo Kai Watch game that she just got at Walmart. Now, I just have to be ready for tomorrow. Don't kill yourself. Take care of your body first. So, go to bed VERY soon and wake up VERY early. Oddly enough, even though I didn't get nearly as much done on the work-front as I had hoped this weekend/holiday, I still feel that I have made significant progress of the most important type to my life, of late. And it is not just this thought stuff. I did a true 80/20 Rule 1st pass at my place. I need to do a second and a third. One night can make all the difference, but then also, I'm a human mortal, aging and wearing. I need to do this all according to a sane plan. Don't kill yourself for work. But then also, make some of the sacrifices you need to make to keep and thrive at that one killer job you landed that you're good at, that makes you happy, that you could see doing in some variation or another for the next 10 years. Grab the reins and make it go the way you need it to go. And that means not flubbing it. Don't be the roadblock. Be that unbelievably on-top-of-things fellow who thought about every angle in 3 different ways already. Be the lightning bruiser. Be the smartest guy in the room, all the way through implementation and support, so that people know you're not bullshitting, and that you are the genuine article. Write really dense paragraphs. No edits! Even here.

--------------------------------------------------------------------------------

## Mon Jul 4 14:31:38 EDT 2016

### Levitron is the coolest toy, ever!

Already 2:30 PM on Monday, July 4th! Holiday almost over. Driving Adi back tonight after fireworks. Making my decision: I'm going to drive her back late after a lazy day, and watching fireworks from the roof. Funny not being at the Catskills this weekend, and unlikely to be there next weekend, because of super hero school -- haha! Awesome. Okay, anyway she's enjoying Mac & Cheese, Cap'n Crunch, and a medley of other lazy-day classics. Using her iPad for the first time all (long) weekend, because I believe she's very much into my phone (novelty) when she's with me. Did the Levitron successfully today, and Adi saw spinning-top levitation of the sort that some formula or other was supposed to prove was impossible -- just Googled it: Earnshaw's theorem. Well, poo poo to Earnshaw's theorem. I can't wait until I comb-in all my other media types into this journal -- so much to do:

- YouTube
- Instagram
- Vine
- Twitter
- Other? Tumblr is mostly just dupes.
- LinkedIn is mostly just dupes too.

Hmmm, maybe I should deal gracefully with multi-channel dupes, and actually create some sort of second-stage easy-to-query structure. When displayed directly in Markdown, it'll always show the most interesting bit, like the YouTube and Instagram embeds. But it will also show the other channels over which similar content had been pushed out.
--------------------------------------------------------------------------------

## Sun Jul 3 13:40:01 EDT 2016

### Clean The Apartment for Adi

Okay, back on the early-2009 24-inch Mac. This will be quite an accomplishment if she doesn't decide to fixate on this computer as the one she wants to be on. There are enough screens in play now that maybe I can actually use one of them for myself that she knows is one she could be using too. That would be a big advancement in maturity -- NOT immediately wanting as her own "current thing" the one thing she sees someone else enjoying in the moment. Some people, I think, never get past that point, and are only ever happy if they can have something they see someone else has instilled value into by virtue of a bit of attention. People with imagination infuse value into whatever they like. I'm very proud of Adi knowing that the real value in things is in "living things" like the kitties, which are not so easily replaced. Places (to live) are quite another thing, but I need to position it just right in her mind. We will not be at this Inwood apartment forever. I'm in no rush now, but I have to move steadily forward -- mostly, in terms of getting organized. And THIS is where my brain gets organized, and I can start to enjoy the benefits of a bit of compounding returns setting in -- without having to suffer through the reset-button getting pressed on me again. This is a novel and unusual and precious time -- for BOTH me and Adi -- so, don't squander it. Let her SEE YOU get the entire apartment under control today with a series of 80/20 rule passes. Priorities:

- My bedroom, and organized bags of laundry

--------------------------------------------------------------------------------

## Sat Jul 2 13:56:00 EDT 2016

### Why all the Pandas? It's not Black & White

I'm going to try to do some journaling here while I go. May have to switch to laptop, because any screen I sit down at is the place Adi wants to be and the screen she wants to use. It's a very familiar situation, and I have to solve these types of things better with Adi now than I did in my marriage. My Surface Pro has alleviated this situation somewhat, but its keyboard setup just isn't as conducive to journaling as a real laptop or a real keyboard at a real desk. No worries. I can make this all work.

Sat Jul 2 18:24:07 EDT 2016

Was just outside for a couple of hours. Even then, Adi likes to use the Pandas games on my phone (increasingly impressive), and I like to read about Pandas on my Kindle. We're sort of in parity. What is it about Pandas, and representing things in the digital world? I guess it must be a lot of things, and not simply black and white.

--------------------------------------------------------------------------------

## Sun Jul 3 12:32:21 EDT 2016

### Get Better at Stuff

Take notes as you walk around and do stuff.

- Puttering is okay. It's like ruminating. But when you got it, document it!
- USE the systems you have -- if they are actually good systems.
- Understand that all systems in a human's life are genetic algorithms
- Be willing to let your systems evolve -- experimentation is necessary
- But you have to kill off the bad system strains and accelerate the good

People often don't see what trips them up most, because they're too close to it -- like a fish in water. Start treating yourself like the human being you want to be. Start behaving like the human being you want to be.
- Dial-down the pettiness
- Dial-up the good-daddy
- Dial-up the insanely-smart initiatives (ad-parity, decrufter, etc.)
- Remember more. Process & distill more. Take next-step actions on more.

It's been almost 4 months since Rachel moved out, and still I haven't really cleaned up. I made a few feeble attempts, and am concerned today will become one of those too. Whenever Adi is absorbed in something and I get into the zone, I can feel it happening, then she snaps out of it, and I (am forced to) snap out of it too. So, my goal today is to keep her absorbed in things. She just discovered her VTech InnoTab 2. Glad it's batteries and not rechargeable. Popped new batteries in, and she's off! Use the to-do list immediately above the current journal entry (always) to much better effect. OMG, puttering... WITHOUT being drained of energy by the day or under stress or running on too little sleep... this is how things get back under control. Shit... Adi and control of my life are (were) at odds. The better I can entice her into things... Ugh! I have the wrong charger here, I think. I may have lifted someone's Macbook Pro charger, though I don't know where from. Hmmm, hope I have a Macbook Air charger here. Anyway, at 27%. This is a good time to commit and maybe prepare to switch to another platform.

--------------------------------------------------------------------------------

## Sun Jul 3 11:55:16 EDT 2016

### Return To Where You Love Life Most (Kafkaesque)

80/20 rule sweeps are going to become an important cornerstone concept. I need to document those. It's like my metaphors or my findings-in-life lists. It's the most important logic-layer near the top of the abstraction stack. It's the one that tells me how to efficiently and effectively get stuff done -- if and when I set my mind to it. It's the one that answers the question hanging in the air, after I do: now, break it down into easy steps. Steps 1, 2, 3... Okay, what's step #1? Step #1 is getting my shoes on. I need the freedom to walk around my entire apartment without fear of pain. Okay, done. Now, what's most broken is my clothes bureau. It's splitting apart, with the drawers stopping working. Push or pull the ends together in a heavy-handed enough way to just solve that annoyance for good (at least for a few weeks). It jams up and crams up, because it stops you from doing many things in your daily organization efforts well. It creates the dreaded back-up queues (laundry, sinks, paperwork, etc.). Gradually become yourself again, writing while you go through the process. Make it kafkaesque -- and use words like kafkaesque. ESC b zg A. Continue typing... now kafkaesque is in my spelling okay-word list, which I keep in a Dropbox location, so all my spelling allowances work across pretty much every system I use regularly -- which is quite a lot. Always remember, there's no situation so bad you couldn't screw up even more by continuing to make bad decisions. And don't just remember that -- try to stop making your bad decisions. Have a nice, firm grip on the rudder or steering wheel or whatever sort of stability and directional control devices your vehicle has. Know how to command yourself like a vessel. Go places. Do things, even if those places and things are right in your own home. Embrace solutions, even if they turn out to be half-assed and temporary-appearing, if they're actually good and create the desired outcome. Don't be a perfectionist or purist about these things. Give up old systems that no longer work for you.
It's like throwing out your last Amiga, and then throwing out all the RCA-cables and BNC-connectors that you used with it. Nice for nostalgia and museum stuff, but terrible for clutter in everyday life. Move on with the world. But don't ALWAYS go exactly where the world is trying to take you, but certainly get the benefit of everybody else having gone there, such as economies of scale. Factor mainstream cheapness into your best solutions, when appropriate. Don't walk away from advantage because that way sometimes smells bad. Be strong enough to occasionally walk and venture where the smell isn't so nice. But don't be so weak-willed as to stay there beyond what's necessary. Return to where you love life most. -------------------------------------------------------------------------------- ## Sun Jul 3 11:47:20 EDT 2016 ### Do 80/20-Rule "Sweeps" Now that I'm thinking about this journal in Data Science and Machine Learning terms, I'm sure I'm going to adjust my writing style here-and-there to really play into the types of algorithms I know I'm going to be writing to mine my memoires out of this silly experiment. The experiment that stuck. This is it -- a one-long-page textfile journal for life. I like Linux, Python, vim and git towards that end, but it's really up to the preferences of whoever adopts this technique is most comfortable with. The one warning I'd say is that it is intended to be for the long-haul, and it may be best to zero-in on technologies whose likely durability over a lifetime of decades is unquestionable. Even if all Unix-like tech goes away in favor of some neural network OS, there's going to be historic archives that instantiate their APIs through booting a Unix or Linux-like system. And once you have *nix, you also have the common Unix commands and data piping, and probably at least vi (in which, you'd be proficient if you used vim or other vi derivatives, daily (or emacs in evil mode)). And so, this is where my mind is at. These little blips of journal entry will be my reward between 80/20 rule sweeps. Now, go do a sweep. Rely upon principle-seepage or osmosis between domains where you've become expert, and domains where you still really need a lot of help. Commit between each sweep. -------------------------------------------------------------------------------- ## Sun Jul 3 11:40:50 EDT 2016 ### Prioritize-up Being Awesome / 80/20 Rule Clean Sweeps It's not quite so bad as other times in my life when stress has gotten to really ruin me, such as when I was running the fucking check cashing place, and I had my sister ragging on me all the time, and dealing with my Dad's death, and suffering from bad advice from family (my uncles / not hiring a lawyer) and (in retrospect, realizing) my mom gradually becoming schizophrenic, and having just graduated feeling derailed right as my life was supposed to really begin. Shit, I've had a lot of fucking shit thrown at me, and not-so-great people around me not really helping me much. But after a point, those people around you are of your own choosing and the decisions you make are more fully your own. And so it is with that perspective, that I parse out my problems, and dispense with the queue as fast as you parse. Work in real-time. 80/20 rule sweeps to fix your immediate environment are not about deep-cleaning. It's about tolerability, and re-stabilizing your own center... and perhaps even, mental health. Yup. That's why the marriage broke. And now, I need to hold up my end of the separation... 
I am awesome -- when I prioritize-up being awesome.

--------------------------------------------------------------------------------

## Sun Jul 3 10:22:06 EDT 2016

### If Played Right, Today is Key (and Macs are better)

Okay, I'm on the MacBook Air laptop that was Rachel's, but she still has one, and I now have one at work and at home, and so I can make the best of a good situation. Typing like this -- actually on my lap -- really reminds me of the superiority of the true laptop form over the tablet / part-time-laptop form of the Microsoft Surface Pro 4. For casual media-consumption use, the tablet form is definitely superior, but for actual typing-work, these 2011 MacBook Airs are far better -- from their keyboard and form-factor (for comfortable typing) to... uh, the full-screen modes of OS X and the 3-finger virtual-screen swooshing... wait! This should be a list enumerating the Mac's superiority in these ways. I actually dislike coming to these conclusions, and an old Amiga hand being won over to Apple so fully doesn't exactly resonate well with me. Anyway, I believe, in approximate order of niceness...

1. Holds a charge for weeks even when not in common use (always ready to use)
2. How well suspend & resume works (closing the laptop / getting WiFi back)
3. How well ScreenFlow works with the front-facing camera and capturing keystrokes
4. Fantastic illuminated, good-feeling, low-profile keyboard (as far as they go)
5. "True" full-screen, very Amiga-like in feel, helps getting into the zone
6. Multi-finger left/right swooshing to flip between full screens (Amiga+N)
7. Unix commands like "ls" in the terminal, and behavior like "piping"
8. "Acceptable" Linux-like repo systems like Homebrew (not as nice as dpkg)
9. Mainstream enough to be a primary target for Anaconda & Jupyter Notebook

Sun Jul 3 11:34:01 EDT 2016

Just took a nice, long shower and Adi didn't try to call me out of it. It was very relaxing. To have this WHILE she's here is a pretty big breakthrough. She's absorbed in my phone, which is still something of a novelty with her. I'm sure in her mind, Dad is Android and Mom is Apple (as far as phones go). This laptop is surely going to be my key to being effective today. Today is my key to getting organized again in my life. Don't squander it. Yesterday wasn't great, and tomorrow won't be great either; yesterday, both Adi and I unapologetically relaxed. We went to sleep at a reasonable time, and... yikes! It's already almost noon. But we both ate. Maybe I just let her get carried away in screen-time and toys today while I do all the work at the apartment. Remember, don't go diving down into rabbit holes. This is about the big 80/20-rule sweeps.

--------------------------------------------------------------------------------

## Sat Jul 2 13:42:45 EDT 2016

### Bank Tiny Wins

I have so little time to clean up and organize around the house -- not just the cursory, surface tidying that the cleaning people used to do, but a really good cleaning and organizing that permanently improves things moving forward -- the sort of cleaning I need to do to get ready to sell this place. And Adi is VERY sensitive to me doing anything that isn't either sitting quietly OR playing highly engaged with her. Basically, me turning off to give anything else attention at all is her signal to rush over to me and demand my full attention. It's a bit frustrating, but this is life. Billy the cat is almost as bad. It's just like how writing is the signal for them to come and walk on the keyboard.
But that's just cats. You can't let those thing be bigger forces than your own human will. But then also, you can't make everything adversarial with your daughter and your cats. There's a way to make this all work, and for me to come out ahead. Bank tiny wins! Bank tiny wins. Of course, the corollary is don't suffer tiny losses -- which I constantly am. -------------------------------------------------------------------------------- ## Sat Jul 2 09:32:55 EDT 2016 ### Yo Kai Watch Metal, Tattlecast Ahhh, had yesterday off. Spent morning thinking about my future, a book, SEO, Data Science, blah, blah blah. Then, picked up Adi from her Grandparent's place and brought her back to NYC apartment, which she hasn't been at for awhile, with all our weekends at the Catskills, and the hand-off usually taking place at the Grandparent's Catskills place. So, she was missing Billy, Sammy and Charlie. She's laying down watching YouTube right now, cuddling with Sammy. She's right now learning about the Yo Kai Watch series-2 metals, and Yo Kai 2 coming to America... woot! Big discovery for her. She described yesterday as the best day of her life, several times. She got the Tattle-Cast Yo Kai that she's been hoping for with every Metal-pack opening since she started collecting. Her Pop-Pop somehow managed to score that for her, and had a wonderful story built-up about how he pulled it off. And just by chance, she got another one that she was looking for in a pack I happened to bring with me. They sell them in Duane Reade now. Sat Jul 2 12:18:03 EDT 2016 Already past noon. Hard to believe. But having a great, relaxing time. Suction cup bow & arrow, and dolls and kitties and Yo Kai Watch and just about everything comfy-cozy. Adi takes after me in that way. -------------------------------------------------------------------------------- ## Fri Jul 1 09:26:31 EDT 2016 ### I am Different. Python is Different. Finally getting started this morning. Beginning to write without a git pull, because the city apartment has no Internet at the moment. Will be an interesting day. Get ready for Adi here this weekend AND get ready for Tuesday menu delivery work. Fri Jul 1 11:37:39 EDT 2016 Took long bath, wrote important article. Internet back. Will commit and merge, then paste the article I wrote. First capture hot thoughts. Time to really start shaping my book. Latest thoughts. Strong nicknames/addresses! Mine? Follow the 1 + 10 + 100 + 1000 model. That makes for a good pyramid to visualize, just barely fit-able on a desktop screen with some text-label info. It goes something like: - 1: Homepage - 2: 10 Visible Top-level Pages - 3: 100 Hidden Top-level Pages - 4: 1000 Long-tail Pages - (5: 10,000 Archive Pages) 1111 pages easily live in an Excel or GSheet, and can be thought about as individual entities. If you chop-off level-4 for mental exercises, you only have 111 pages to keep track of in your mind as your most-important real estate of your site. These pages, due to the way most navigational techniques work, actually have direct links to them on every page of your site, by virtue of primary navigation actually cutting across every page of your site, and that next-level of menu selection that I call level-3 here, actually cutting across every page of your site as well. They're usually just hidden or collapsed or accordion'd-up in a dropdown menu. So, that selection of the 10 pri-nav and ~10 dropdown items per pri-nav element is all-important. 
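Just to make the scale of that 1 + 10 + 100 + 1000 pyramid concrete, here is a minimal Pandas sketch of it, using the level names from the list above (the column names are my own, purely for illustration):

```python
# Minimal sketch of the 1 + 10 + 100 + 1000 page pyramid as a Pandas table.
# Column and label names here are placeholders for illustration only.
import pandas as pd

levels = pd.DataFrame({
    'level': [1, 2, 3, 4],
    'label': ['homepage', 'visible top-level', 'hidden top-level', 'long-tail'],
    'pages': [1, 10, 100, 1000],
})

print(levels)
print("Pages to hold in your head (levels 1-3):", levels.loc[levels.level <= 3, 'pages'].sum())  # 111
print("Pages that fit in a spreadsheet (all levels):", levels['pages'].sum())                    # 1111
```

Levels 1 through 3 add up to the 111 pages worth keeping in your head; all four levels add up to the 1,111 that live comfortably in an Excel or GSheet.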
These are your (including the homepage) one hundred and eleven most important pages, and correspondingly, your one hundred and eleven most important "core target terms" -- which I'll talk much more about later. Mine? Most likely:

- mikelev.in
- mikelev.in/tech
- mikelev.in/book
- mikelev.in/show
- mikelev.in/data
- mikelev.in/poems
- mikelev.in/tools
- mikelev.in/linux
- mikelev.in/python
- mikelev.in/vim
- mikelev.in/git

While some of these top-ten items feel like they should be nested inside each other's hierarchies (such as linux, python, vim and git under tools), I am deliberately not doing so. These are the terms that I want to "tweak" up in the most surface, visible way on my site. They truly are my primary-navigational items, and out of a sense of purity of example, I am limiting myself to 10, even though I could think of quite a few more. There is just so much to say about these particular 4 tool choices that I see a need to promote them up to the very top level. The message here is that this site is mostly about technology, with an emphasis on education, and a strong focus on very particular tools that come into play in technology and education. Probably glaringly missing here is my primary career label these days, SEO, and that's deliberate too. I feel that I'm gradually promoting myself up to Data Scientist, just as I did from Webmaster to SEO before that. It may sound a bit pretentious, but it's always really been my deep-down dream to be a scientist, contributing something to this world, and now I see a path. And since all my URLs are short and snappy, I figure /data is a good proxy for /datascience or /datascientist, neither of which comes off as quite as good a URL as mikelev.in/data. And yes, writing like this in some of the most valuable time I've ever found myself having over the past 10 years DOES qualify as more important than cleaning the place up in preparation for Adi, and selling the apartment. I have to light a fire under my ass, but this stuff is time-sensitive too. I am vomiting out a bunch of important thoughts that will change my life forever -- no small thing.

Fri Jul 1 12:02:35 EDT 2016

Just got the call from Eva/Adi about when I'm going to be up there... hahaha! Okay, reality intrudes, and the time that I had was squandered, bathed, ideated away. But it was a worthy trade. The actual menu work for work will have to be after Adi leaves, or during the night while she's here -- but I'll be in a MUCH better position to do this hierarchical stuff after this round of work. You're getting powerfully into the "mode" and mindset to discuss this stuff. Okay, just dump your awesome bathtub book-writing here, and then get on your way to Adi's grandparents' place in the Catskills...

Note: When you publish a YouTube video, look at the other videos that Google tries to auto-play after yours to understand how all their Machine Learning thingies have categorized your video. BAM!

Book stuff:

# Python is different

## PART 1: OTHER POPULAR LANGUAGES

No matter what you've heard or think about the Python programming language, it's different than what you expect. How it's lumped in as one of the "P" scripting languages of the Web like PHP and PERL is very misleading, as is its seeming rivalry with the trendy newcomer, Ruby of Ruby on Rails fame.
In fact, I'd go so far as to say that unless you're a language implementer, all of these are quite a bit different than what you think they are -- only Python more so, and in all the good ways that make it among the worthiest and most useful of all to know. This article is a fairly geeky dive into how I found and fell in love with Python, and how I believe that if you're a journey(wo)man on a similar quest to find your one true native machine-tongue, you've found it in Python for the foreseeable future, quite possibly for the rest of your life, and maybe even into the lives of your children. I know I'm teaching my 5-year-old Python the minute she can graduate beyond the more visual and kid-friendly Scratch and Blockly.

To understand why I feel this way, one must look at some of the other major languages. Let us start with the two biggies in everyday use out there that will also land you a good job with a decent salary as a programmer, and which are NOT considered scripting languages -- a distinction that's becoming much less meaningful with every passing year. My assertion that Python is itself the BEST language to learn (not the best language there is) is of course a bold and inevitably controversial statement. It's not like I've programmed in every single programming language, but I have sampled quite a few and sincerely tried to get over the huge hump of a learning curve in everything from Assembly to C++ to Java and C# (Microsoft's premier .NET language, pronounced C Sharp). Such languages have defeated me in terms of the intense, continuing commitment they require even just for basic proficiency. My .NET and Java defeats are just two in a smorgasbord of language samplings I tasted in what I now recognize as my quest to find my one true native language for speaking to machines. Other languages I took to much better have been somewhat marginalized by time, and eclipsed by Python itself, including ARexx (REXX on the Amiga), Tcl, and VBScript. And while I don't program in LISP or Haskell, I've been convinced that the highly meta/functional perspectives they provide are a necessary part of anyone's big picture, as is poking around with the ultra-new concurrency-centric riffs on C known as Go and Rust.

MUCH of my professional programming time had been spent on Visual Basic Scripting Edition (VBScript) on Microsoft's first sincere offering for the Web, Active Server Pages (ASP), circa 1998, which is the now-unsupported tech behind all those .asp pages on the Web. It's been replaced by C# and VB.NET, which are all those .aspx files. I didn't make the transition along with Microsoft because I despised their coding environment called Visual Studio, and their attempt to separate templates from programmatic logic with "code-behind" files, and their attempt to solve the Web persistent-state conundrum with techniques called viewstate and postbacks. Ugh! Ugh! Ugh! I could viscerally feel Microsoft's cluelessness when it came to trying to keep casual developers like me on the Microsoft bandwagon after the fool-me-once debacle of pulling the rug out from under VBScript, after my investing my early-twenties career years into ASP. It was a bandwagon fad, and I was stung by it. Everyone who wanted to preserve that "everything is easy" feeling of those early VBScript-on-ASP days flocked to PHP on the then-mega-popular LAMP stack -- standing for Linux, Apache, MySQL, and (PHP or PERL or Python).
Per the encouragement of my then-mentor and still-friend, Gerard Bucas (founder of the Commodore spin-off company Great Valley Products (GVP)), I tried implementing my very Ruby-on-Rails-like VBScript framework on .NET, and failed miserably. I was reeling. I was ready to quit my decade-long work experience with Scala, serving alternately as their print production manager (in the days of boxed software) and Webmaster (after quitting for a couple of years and returning). My ambition and self-image were running high, and my Microsoft experience was knocking me down. I recognized and began to hate quite how "opinionated" .NET was, with its heavy-handed prescriptions for how to tackle this particularly intractable problem or that, and how dramatically my own opinions diverged from how they were trying to get me to interact with and relate to the online world. They were clearly forging a path for their armies of WYSIWYG-handicapped VB desktop app developers to flock onto the Web. We ragtag scripty early adopters of Web Dev were clearly not the priority, and felt it. Only later did I recognize the smell of a rushed-out, half-baked, overly prescriptive, desperate attempt to stem the tide of a growing interest in the non-Microsoft tech from Sun Microsystems, called Java.

Java did many modern things first, in a language rigged to be already comfortable to C developers -- like automatically releasing previously used but now unused memory back into the pool of available memory for running apps, in a process called garbage collection. Java also implemented a sort of computer-inside-a-computer, called the JRE, or Java Runtime Environment. The JRE is a bit of software that implants a translation layer between the capabilities of the native (real) hardware and OS, and a fake (virtual) lesser computer running inside, built to very specific specifications (for compatibility, performance, security, etc.). Java programs get compiled for and run in the JRE, and not (as such) on the native hardware. Copying the C programming language in a transparent attempt to siphon off C, C++ and other C-derivative language developers with a memory-protected language whose compiled binaries can run on the JRE of any host machine was Java's big contribution to the scene. Yawn! But somehow this spooked Microsoft into aping Java point for point. Microsoft's JRE is the Common Language Runtime (CLR). Microsoft's aping of Java (which itself is an aping of C) is C# (C-Sharp). About Microsoft's only high-level original thought was a very slick IDE (integrated development environment) to actually write the C# or VB code in -- the previously mentioned Visual Studio, then called VisualStudio.NET until everything in Microsoft-land became .NET and they dropped the suffix. Java's Eclipse IDE couldn't compete with it, because a seldom-acknowledged shortcoming of Java's whole approach is that things which could and should have looked like "native" apps didn't. Somehow, desktop programs written in Java for Macs and PCs (all Java code is portable, remember) were always a little bit clunky and unnatural, as any user of LimeWire, Screaming Frog, or any other desktop client software written in Java can attest. Punchline: Java's IDE, Eclipse, suffered from this same crappiness, providing the VB sheep plenty of reason not to switch to greener pastures.
Then Oracle bought Sun, getting Sun's still-proprietary Java in the deal, and embroiled Google in billion-dollar lawsuits over about 20 lines of code in something that should have been free and open source in the first place. Somewhere in there, Java's epitaph was written, as something like "Much less special than people thought, and a big mistake to copy so precisely." And so, the glorious light that was Java dimmed to a soft glow, and along with it the silly propaganda hype of calling Java and .NET the two "Enterprise Class" platforms, somehow relegating everything else to some sort of second-class also-ran. Nothing could be further from the truth. Everyone stuck on Java and .NET today is there precisely because Sun and Microsoft (and, with Google choosing Java for Android development, Google too) stuck you there, and strong mental barriers slowly grow like crystals up around your open mind and free thought, until all that's left is a very closed mind, running only those thoughts that Sun and Microsoft saw fit to put there. We ourselves are programmed by our adoption of, and learning investments into, our programming platforms. This is not always a bad thing -- when the programming language's (platform's) assumptions, paradigms, and ultimately OPINIONS of how you should be thinking, acting and coding align well with and enhance your own behavioral predilections. And when the fit is perfect, it feels like having superpowers.

--------------------------------------------------------------------------------

## Wed Jun 29 20:18:36 EDT 2016

### Perpendicular To Everything We Know

I get home, and I immediately want to do a git pull, quickly deal with any merge issues, and get journaling. I see the codes inserted by git. I'm going to make them render as code, so it looks good as markdown. I'm going to be writing some good stuff. It might be tremendously dense like this, but I do think I finally have the makings of my book, and I feel really compelled to get it down, dense and rough as it may be.

First, we peel away layers of perception, and then layer back on new perception pattern-recognition models. Look for different types of things with different types of senses, and respond differently, according to some set of genetic learning algorithms, so you know what not to do again, so you don't almost die again. Or else, you die. Simple formula, really. Dawkins argues it well. No reason for me to go there too. I 99% agree with him. That last 1% is God, or some singular higher-dimensional being that might as well be -- for whom we are a cross-section shadow -- who somehow managed, through great personal effort, to come down and walk with us a few times. Or that's the belief I come to it with. Go google Dawkins and Michio Kaku debating this God-existence stuff. Michio Kaku is right. It's turtles all the way down... or up, for that matter. Can't beat that problem of induction, Atheists. All the scientifically backed-up observations you can make still can't trump the fact that your senses may be lying to you, and you can't prove that they're not. All experience is subjective anyway, so who's to say -- without true group consciousness? Ah, group consciousness -- a modest goal, and some strong evidence that would weigh in favor of a common objective universe that we all occupy... oh wait, or does it? Nah, induction again, dammit!
How long are we mortals going to have to wait before we can come up there and walk with you, saving you a little bit of effort of finding us in all that foamy bright matter. At least, this is more-or-less the perspective I come at it with. If you didn't want to think we were in a virtual simulation, you shouldn't have made the speed of light and quantum randomness work the way you do. I think we've got you here. Particles do go somewhere when they pop in and out of existence. It's called the fourth spatial dimension -- or the fifth-dimension, if you've already filled the spot for the fourth with your monkey notion of a derivative effect of being an observer from inside the system. No, the fourth spatial dimension is like length, width and height, only out at a 90-degree angle from any of those, just like the interior of the good ship Tardis. We're talking hyper-cube tesseracts and hyper-sphere n-sphere. I think the equivalent of Eric Graham's "The Juggler Demo" of the Amiga days, using a pixel raytracer equivalent of the SSG program. SSG stood for "spherical solid geometry" and was used to create one of the most important computer visualizations of modern times, in a looping about every second 30-frame juggler sculpted from spheres and sphere-tubes, and juggling a bunch of reflective mirrors. Somebody do that with n-Spheres, will ya? 'Cause THAT would be a neat visualization, and will give us a little practice for walking with the big ol' one up there. Out there? Perpendicular to everything we know there? But that pinhole poke to hit such-and-such a time and place and point is no small trick, even for God. We can't put limitations on God, I know. But look around us. Now, go read Rendezvous With Rama and return. Got it? The Universe is a lot more like a NoSQL database like MongoDB, than it is a relational data base management system (RDBMS) which generally implies tabular data, except maybe with PostgreSQL and a few others that can do stupid multi-dimensional array tricks, and is the reason for the aptly named ndarray object in the CPU-bound but otherwise high-speed Python NumPY matrix analysis software that can still operate efficiently across parallel processors none-the-less, if you use the Ufora Python "interpreter"... but interpreters are all really virtual machines these days, and so it takes us to some of the original CompSci insights and a world of infinite possibilities, including AmigaDOS and CP/M and ITS and TOPs and VAXOS and a bunch of others, until along came BCPL and C and Unix... then Linux... then Unix again. Now, sprinkle in microcontrollers with tiny high-level language, like oh say Python, embedded in similar to an Arduino, but more like a Raspberry Pi, for devices like the Onion.io and the Internet of Things (IoT). And then, there's robots. Let's not forget the robots, and the machine learning, and the datacenters in a shoebox, and scalable spark approaches that can spontaneously spawn large-scale neuralnets, by virtue of reliable local code execution environments <pre> <<<<<<< HEAD </pre> -------------------------------------------------------------------------------- ## Wed Jun 29 08:01:37 EDT 2016 ### Adi and the City Getting ready to head into work. Just putting this here as a reminder to not ignore getting the apartment in order. Adi will probably be here over the weekend -- picking her up FROM the Catskills this time. She misses the Inwood apartment and the cats. I have to let her see Billy, Sammy and Charlie. 
But I also want/need to start getting the place fixed up for sale. Solving difficult simultaneous equations again.

<pre>
=======
</pre>

--------------------------------------------------------------------------------

## Thu Jun 30 17:40:19 EDT 2016

### A Deeper Dip into Corporate Data Than Before

I need to take my Menu work home with me, but also make it LOVE-WORTHY... separated from the newly delicious SQL tables from which I have to piece together the hierarchy. I also think I might like to do it a bit from a giant list of URLs through Pandas.

Thu Jun 30 18:12:22 EDT 2016

Okay, I just created my first matplotlib data visualization. Now, all I have to do is bring all the data home with me that I want to work on -- possibly tonight, as I alternate between that and cleaning up around the apartment. I really helped a co-worker quite a bit today with that Pandas data-chunking problem. Useful for me to learn a thing or two as well.

Thu Jun 30 19:11:47 EDT 2016

Okay, heading home now, but I have everything needed to construct the site taxonomy menu from the core database. I can layer on search frequency and performance data from home. Really doing a Hail Mary here, for sure. I think I can do it. The menu has to feel somewhat familiar and pass the "smell test", which I think the last one didn't, because I wasn't using the "real" taxonomy data to start with. I am now. I did a bunch of CSV file exports from SQL, and I'm going to be using Pandas for all my post-processing -- even the joins!

--------------------------------------------------------------------------------

## Thu Jun 30 13:28:34 EDT 2016

### You Can Chunk DataFrames in Python Pandas to Process Big CSV Files

Focus! I'm at a very interesting point now, where I have this journal repo in use on both my personal Mac and on my company PC. I should be able to commit just this file here, and just that file (the .ipynb file) there. Commit this right now and merge the two. Okay, done. Just do some pure exploration in the tables. Oh, I'm showing myself as a SQL Server person with the TOP X way of constraining rows. I need to learn to use LIMIT X at the end of the query, as is the PostgreSQL and MySQL way.

Thu Jun 30 17:05:45 EDT 2016

Just did some work on applying Pandas against a very large source data file, employing chunking to process it in smaller bits:

```python
import os
import sys
import pandas as pd
from time import sleep

rowsperchunk = 250000
chunkstoprocess = 50
filename = 'bigfilein.csv'
outfile = 'smallfileout.csv'

# Start fresh: remove any previous output file before appending chunks to it.
try:
    os.remove(outfile)
except OSError:
    pass

finished = True
for index, chunk in enumerate(pd.read_csv(filename, chunksize=rowsperchunk)):
    sys.stdout.write('%s, ' % index)
    # Extract the number following "keyname=" into a new column, drop the
    # columns this particular file doesn't need, and rename what remains.
    chunk['D'] = chunk['colname'].str.extract(r'keyname=([\d]+)')
    chunk.drop(chunk.columns[[0, 1, 2]], inplace=True, axis=1)
    chunk.columns = ['longstring', 'extraction']
    chunk.to_csv(outfile, mode='a')
    sleep(2)
    if index >= chunkstoprocess:
        finished = False
        break
print("Finished: %s" % finished)
```

...and I also don't want to forget what makes VIEWING this while it's processing possible:

```python
# Wrap sys.stdout so every write is flushed immediately instead of buffered.
class Unbuffered(object):
    def __init__(self, stream):
        self.stream = stream
    def write(self, data):
        self.stream.write(data)
        self.stream.flush()
    def __getattr__(self, attr):
        return getattr(self.stream, attr)

sys.stdout = Unbuffered(sys.stdout)
```

That allows you to view things WHILE it chunks, instead of waiting for all the chunking to finish before the cached buffer dumps to the browser -- ultimately as unsatisfying as not chunking, though much less memory-demanding and less prone to freezing up your machine.
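Picking up the "even the joins!" plan from the entry above, here is a minimal sketch (not the actual work product) of what a SQL-style join looks like in Pandas once the CSV exports are in hand. The file names and column names are hypothetical stand-ins:

```python
# Minimal sketch of doing a SQL-style join in Pandas on CSV exports.
# File and column names below are made-up placeholders, not the real exports.
import pandas as pd

taxonomy = pd.read_csv('taxonomy_export.csv')       # e.g. columns: url, category_id
categories = pd.read_csv('categories_export.csv')   # e.g. columns: category_id, category_name

# Roughly: SELECT * FROM taxonomy LEFT JOIN categories USING (category_id)
joined = taxonomy.merge(categories, on='category_id', how='left')
print(joined.head())
```

The how='left' argument mirrors a LEFT JOIN; 'inner', 'right' and 'outer' cover the other cases.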
--------------------------------------------------------------------------------

## Thu Jun 30 09:43:07 EDT 2016

### SEO Science - Let's Get Meta

Okay... debugging the Keywords process for one property not running this morning. Caught it! I want to put better debugging into my program. Switched it from 2-character indents to 4 to stay better in compliance with PEP 8, if I'm going to be going in there a lot with my new .vimrc set the way it is... which I'm not, so... so whatever. Gotta go to 4 spaces, even if I'm just in there occasionally. Make today super-effective. We have a 4-day weekend coming up (amazing!). Maybe we'll watch fireworks from the building roof with Spider. Wow, gotta think through this whole Inwood thing. If only I were making more money, I could keep the place. Long way from Adi these days, but it is where she grew up, and it is worth a try -- if I don't consider what I've done so far already as trying. I need a money-making scheme that will just work perfectly, without siphoning off any energy from work, but rather ADDING TO IT. Many ideas are percolating. But for right now, get that Keywords table in that property's report filled in! Got the first emails out.

Thu Jun 30 13:15:34 EDT 2016

That problem was fixed. Had lunch. Gotta get and keep my mind in the game. This is my window now for the menu work. You want this to be brilliant. I am a technology employee applying my techniques to smart menu construction. I will be with my daughter for most of tomorrow, and for most of the weekend and through Monday, in all likelihood. With machine learning improving, crowd-sourcing is on the edge of being dead, or so says one of the Partially Derivative guys. Mechanize. Here's that writing from my phone from yesterday that I forgot to paste in:

SEO Science, by Mike Levin

Let's get meta. I am an SEO. That means search engine optimizer, for those not in the elite circle of ruffians who know. Congratulations; now you are. As I type, I am generating a unique grammar signature. By publishing it online, on a unique resource found at a registered domain name that is strongly associated with me, I will be known to all who run stuff against the ancient, historic GitHub repository of all human knowledge that first Google, then The Internet Archive (the Wayback Machine), likely the DOD, and now almost everybody is keeping -- and will be keeping -- with good data-walked, DNA-mapping-like gap-reconstruction techniques blending in every newly found archive to digest. The gaps and the versions get filled in, as if git, but more like Mercurial (the hg, nearly command-compatible, yet still very different, almost-native-Python one-time competitor that lost). But it's cool. You can query it like Google and stash your private changes directly into the main repo without branching or altering the main head. I don't think I'm going to be able to hide my writing style or identity from Google, or from anyone with that process that likely outed the Bitcoin author. You can't trick the machine -- so don't expect to. Yet, I'm an SEO.

Chapter 1: 40K-foot View. Notice, I'm not giving you the satellite or even boson-camera view of things. I'm just giving you a gravity-bound, fossil-fuel-burning, rise-of-tech-era, barely-better-than-riding-a-tyrannosaurus view. And I make you aware of the fact that that's what I'm doing. Wow, that's meta. Data is beautiful. Reality is beautiful. Our ability to see beauty and feel life is a friggin miracle. Our related ability to actually not fall apart and become dysfunctional is a bigger miracle still.
Making it to a ripe old age between 80 and 100 to merely die of the parts wearing out is the biggest miracle of all... Oh wait; no, maybe it is our ability to encode and communicate and transfer knowledge and information and experiences (data) from generation to generation, to keep us alive in data sampling of product, and the recipes and ingredients to more or less reconstitute it. Same DNA. Probably a new soul. Who knows? So clearly, I'd rather be writing SciFi along these lines -- the rise of clone clans as evidenced by the multiverse peek known as Ann Leckie's Ancillary Justice novel, or the creation of a new type of legacy systems based on solar-charging C3P0-like machine/human interface robots that are built very rugged and able to hibernate. It's greater than my personal ability to implement, so I will entice all you youngsters to do so. So go ahead and try to take over the world with your clan of clones and your robot armies to back them up. I'll try to equip you with all the beautiful, fundamental, and profoundly enabling tech tools that are really all around us We're meatsacks banging sticks, setting off nuclear reactions and coping with speed of light size of Universe questions, and now boorish lyrics competing with each other for online audience cell time, so we can turn around and sell a portion of that audiences' eyeball attention to advertisers. That's our (my current) biz. I will use this to generate some more revenue for myself personally, while I try to do it to new heights with/for my employer. Alliances! Synergies! But still a bombastic and boorish attention-whore. I'm sorry. I will redeem myself through sharing what I do with Adi's homeschooling education along these technology lines. -------------------------------------------------------------------------------- ## Wed Jun 29 10:47:54 EDT 2016 ### Derailed by Phantom Pandas Having played with tmux for a bit yesterday, what I can say is that I'm glad that I'm getting to know it, and can consider it part of my familiar toolset, it definitely adds tons of unnecessary intellectual overhead to simple session management, if your terminal sessions are local, and don't have TCP/IP disconnect issues with remote servers. All those Ctrl+B's are completely annoying, and that one configuration setting I need to add to make switching between panels mouse-sensitive at any new machine I sit down at and want to use tmux on is annoying. Also, I really don't find myself needing to keep those long-term sessions connected and accessing them through the tmux server. I totally understand this is possible, but so far, this is not something I need to do. In fact, in a way, my getting familiar with tmux could not possibly be timed any WORSE, considering my pretty big move to running Python locally on whatever machine I'm sitting at, rather than on some remote server somewhere, where I'm confident I can run only one version of Python, that all the dependencies of all the packages I want to use will be fetched and (occasionally) compiled and installed properly. Anaconda sees to that, but my optimism on the Windows platform was premature. I'm trying to use conda's equivalent of virtualenv, and the "source" command can never be found. This is obviously simple PATH environment variable stuff, but on Windows, environment variables are never simple. I wish Windows itself had virtualenv. Wed Jun 29 11:34:35 EDT 2016 Feeling the advantage of the journal being on my personal laptop, which I take around from meeting to meeting. 
Macs totally handle wifi better than Windows (at least, 7). At the Sprint meeting. There's lots of activity today surrounding SERP changes that have been going on a lot today. I should really start positioning myself as sort of a new voice of modern SEO. The faster I embrace the tools and methodologies of Data Science and Machine Learning, the better. My current projects are taking me in by baby-steps. At least, I'm ising Jupyter Notebook and Pandas now... almost, regularly. I'm reading the free O'Reilly Python for Data Analysis Pandas book now, and getting the insight on all the amazing features built into IPython Notebook... wow. Not merely a REPL, and not quite an IDE. Fits into my workflow just right. - Take advantage of the ML nickname of Machine Learning (also my initials) - Do some interesting play-on-words for branding my content, such as Mike/Reduce Wed Jun 29 16:06:26 EDT 2016 Lots of distractions today on one-off SEO requests, and the activity surrounding whether we're seeing another Pandas update, and the more likely scenario that we're just seeing the more frequent, incremental shifts of Panda now being part of the core ranking algorithm. That means Panda-like updates should just be trickling out with new SERPs. However, plenty of people are reporting discreet "Phantom" updates. I think how you label these changes is splitting hairs. There's a black box, and we have to predict its behavior. I'm going to take one more shot at getting Anaconda working correctly on Windows. I'm re-downloading the installer, as it is up to 3.4.1 (I was on 3.4.0). I lost my way between yesterday to today, but understandably. One of my big findings was that I was not going to be able to reasonably run Jupyter Notebook directly from Cygwin, but rather, I have to launch it from a standard CMD window, which I happened to hack Courier New as the default font so that I can stand looking at it. Wed Jun 29 18:23:41 EDT 2016 Just came back from the Happy Hour. Heading home now. I actually got the conda virtualenv stuff working. The trick was to just use "activate venv" rather than source activate venv. Time to commit and go home. -------------------------------------------------------------------------------- ## Wed Jun 29 08:01:37 EDT 2016 ### Adi and the City Getting ready to head into work. Just putting this here as a reminder to not ignore getting the apartment in order. Adi will probably be here over the weekend -- picking her up FROM the Catskills this time. She misses the Inwood apartment and the cats. I have to let her see Billy, Sammy and Charlie. But I also want/need to start getting the place fixed up for sale. Solving difficult simultaneous equations again. -------------------------------------------------------------------------------- ## Tue Jun 28 12:04:42 EDT 2016 ### Even Anaconda Isn't a Cure-all For Windows Wow, it's already noon. The net was down for a short bit causing a brief interruption, and I needed to do a bit of spreadsheet massaging for the boss for some past-work. I like checking thing off the list. I'm working through all my work with progressive chisel-strikes, beginning to make the really roughest outlines of the sculpture. My boss is about as ideal as it gets, actually having overarching strategy for what he's doing with the department -- something I've never really seen done well before. 
Usually I'm the top dog in the SEO department, making all the big, important calls, which is actually not the best, because while I love my work, I am definitely a hands-on, implementation-mechanic sort of person. Inbound link-building, for example, has become a bit of a blind spot for me after Google started cracking down on the practice and you can't just buy your way in. Inbound links are still incredibly important, but beyond just organically encouraging them to occur, I don't really have that many "products" to offer. My boss did and does, and it was a pleasure to see it all play out, and to play my role where my particular Kung Fu really did help. I'm not just implementation, really. I'm more of a digital detective AND implementor. It's a nice thing to be. I feel good about it. It provides me the next round of YouTube content that might actually be of general interest -- the new wave of tools for data investigation, such as Jupyter Notebook and Pandas, that is. I have a huge amount of work ahead of me today and tomorrow, but thinking about it that way is bad for me. The mere existence of well filled-in data should IMPLY the menus I want to use for the website. However, the data is not nearly as well filled-in as I want/need, and so doing that today KNOWING what data structure I'm targeting for the menu will help immensely. Producing algorithmically recommended website navigation menus would be a HUGE win, especially when the data behind it is absolutely clear, intuitive and compelling to everyone involved in the process. Quantify HOW MUCH traffic you're actually targeting and going after, by mere virtue of the menu selections. And now, go have a light lunch.

Tue Jun 28 13:25:38 EDT 2016

Also stopped at Duane Reade and picked up some fun stuff for Adi for the weekend. Duane Reade carries Yo Kai Watch metals -- score!

Tue Jun 28 14:13:00 EDT 2016

It's strange that I set up Jupyter Notebook for the first time on my Windows Lenovo laptop, yet I haven't touched it there since I started using it on my old MacBook Air. So, all my conda virtual-env stuff isn't on my laptop yet, and this looks like the makings of a tip-area near the bottom of my journal, as I commit this stuff to memory. Let me do it one last time here, and then be sure to make one of those things at the bottom.

```
pwd
conda create -n venv python=3.5 jupyter
```

Tue Jun 28 18:06:24 EDT 2016

Ah ha! Even on Windows, Anaconda isn't a walk in the park. The path issues become a bit trickier when you actually use the conda virtualenv trick. Still working out a few things. Hard to do the source command due to bad paths. Had to reinstall. Day over. Not sure what the conclusion is, but I'm doing a conda update conda before I leave. Ugh: 'source' is not recognized as an internal or external command, operable program or batch file. Oh, Windows, is even the whole Anaconda endeavor not enough to cajole and coerce you into working correctly?

--------------------------------------------------------------------------------

## Mon Jun 27 10:15:07 EDT 2016

### Monday Morning SEO Report

Okay, weekly report! 1, 2, 3... 1? Think through last week. Check your email and conversations. Don't downplay the importance of the various things you were probably involved in. The amazing Mr. Schism has once again started vocalizing his ideas about Levinux, and I'm listening. Gotta think about bandwagons and the viability of personal projects. I see myself compiling QEMU precisely and efficiently across all platforms for Levinux -- but that's a super-huge distraction and a rabbit-hole to boot.
And so, I suppress this distraction as I move on to get my second coffee of the morning in my last-and-final distraction before cranking out my weekly report. Don't forget: - Those things in GA you're investigating (2 of them) - Those people you helped last week (2 of them) - That attempt of a thing that failed last week -------------------------------------------------------------------------------- ## Mon Jun 27 09:49:16 EDT 2016 ### Be The Real Deal or Get Off The Pot Almost weird doing a git pull and not having updates from home after a long weekend. Just shows me how busy I was over the weekend. I want more than ever to "blend" my YouTube videos right into this journal-stream for providing context about what's going on in my life (again, for MY benefit -- not yours) "for free" using data I'm also concurrently producing over other channels, just in the course of living my life. I also think I'll blend in my Instagram and tweets. One long page of that stuff all dove-tailed together in NOT a mess may actually be a noteworthy experiment and a contraianly delicious demonstration of all the new technologies I'm taking up and becoming familiar with that complement so well my Linux, Python, vim and git workflow. There's a lot of good data to analyze here. What better data sets to become familiar with the tools with than the data sets I generate (am generating) myself? I feel this really starts to pull together all my disparate, but somehow interrelated activities. I am actually living a highly integrated life -- or, at least, that is my aspiration. Adi's homeschool education, by time with her in the Catskills on weekends, my day-job of SEO (transitioning to data science), and this daily journaling and preoccupation with martial arts-like infotech tools and their mastery. Hmmmm, okay. Let's see. Gotta get my weekly report out. Keep sanitizing as you write here, because it's not your actual down-and-dirty daily work-journal where potentially proprietary information flies. Information wans to be free, but the corollary is just as true -- information wants to be valuable. Granted, valuable information may have a short shelf-life, because once it's used, you likely have tipped your hand to competitors, and have to now go about acquiring even newer and better information that somehow yields a greater competitive advantage over the competitors than your last hand. It's an ongoing game. It's the transience of things that makes SOME valuable information valuable, but then it's also the ability to keep a secret for a good, long time -- AND to "lock things down" so that secret information remains secret. That's the kind of secrecy that is difficult to pull-off, but can really make a big difference, and that seems to mostly be Apple's modus operandi. I am certainly not dealing with Apple-caliber secrets, but I do have to watch myself. This captures the free and open source dimensions of my work -- big emphasis on frameworks and generalities, and keeping relevant in my field. Big sanitation on property names, particular endeavors, and anything where a clue-full person could piece together proprietary secrets. But still, I must write to be effective to process my thoughts, and use that writing in a Robert Cialdini commitment & consistency play -- almost as performance-art, to stay compatible with the modern sharing culture and valuable original content production that's so important to the evolving field of SEO. Be the real-deal or get off the pot. 
-------------------------------------------------------------------------------- ## Fri Jun 24 12:09:11 EDT 2016 ### Google Analytics Query Explorer More work on yesterday's data-pull project. Figured out how to use filters in the Google Analytics Query Explorer. I'm having a good time. This is re-positioning at it's finest. There was too much of a "finality" that I felt with committing Python code on pseudo-deployed testing servers, just to have your data not read/writable using the filesystem of your local machine and have to figure out that step. When Google tries to give you all the data that you have coming to you by virtue of using their products like Google Analytics and Search Console, you have to be smart enough to ask for it and get it in the proper way. And for that, there is hitting APIs. And for hitting APIs, there is knowing all the stupid nuances and in's and out's of the rules they set up. But Google really did give exploration and experimentation a big boon with this: https://ga-dev-tools.appspot.com/query-explorer/ It took me some googling to figure out how to work those filters, with the double-equals (as a Python user, I should have guessed) and the commas between filters. Wow, this stuff is powerful! Because of how technology progresses, the truth is that we are always personally re-platforming, all-the-time. It's just an unavoidable condition of the information-technology life/career/whatever. But as you re-platform, don't let muscle memory and years of mastery of particular tools be a casualty. Linux commands, Python coding, vim text editing and git distributed version control system at your service. Sure, learn all the nifty (usually) Web-based UI's that make life easy, like Github and Gitlab for git, and Jupyter Notebook for Python, and a nice GNOME desktop for Linux, and... hmmm... vim for vi. If you really want to splurge, you can use gvim, but I'd advise against it. -------------------------------------------------------------------------------- ## Fri Jun 24 09:25:09 EDT 2016 ### Don't let yourself be outclassed by Google Hey, that became my headline earlier than I usually write headlines for my entries. But that sums it up nicely. Using Pandas to not be outclassed by Google, who's wrought changes so vast as to induce existential crisis across the dirty long-tail of people who fancy themselves as search engine optimization professionals. Pandas, and Penguin that came after it, and Vince that came before it, and all the many sub-projects that came along the way are Google's way of saying: "Optimize that!". The cat-and-mouse game is won. You have been outclassed by the gradually machine-learning Google. You are just still getting away with a few things today, as Google hasn't quite gotten around to everything yet. I mean, now could they? There's so much information in the world that the very scale of the problem outclasses Google, itself. So, the game hasn't played out fully, by any measure. There's long-games here that few of us have even imagined, and plenty of surprises that are going to come from unexpected not-heavily-financed directions. The one invaluable commodity in this whole picture is... no, not data... is creativity, imagination and insight. It's the ability to see plausible stories emerging from the data that you're looking at, and making connections and unlikely correlations between different, previously unrelated data-sets. 
It's thinking across dimensions, with different human-understanding-models of an often-core and often-repeating set of concepts, we sometimes label design patterns. Design patterns occur in nature, in the ultimate information systems such as DNA and life, to the simplest patterns, such as crystals. My machines at my desk are set up in a very curious state, and I thought I'd capture it. -------------------------------------------------------------------------------- ## Fri Jun 24 07:35:16 EDT 2016 ### Commit Little By Little Choose your abstraction-levels and API details carefully. In them, you can glean your future flexibility, ability to pivot and change your mind and take different approaches, and do transformations from the setup that was due to the presumptions of the old system to a new setup that caters to the presumptions of the new or altered system. You can do this in small steps and attempt to maintain backwards compatibility. Big choices surround moving your data and re-basing it on new hardware, database engines, etc. There are countless approaches, and all of them are valid so long as they get the job done and don't excessively limit your future options. Always be working on your next step-1 starting point. Insurmountable hurdles will suddenly start to disappear, and the friction and resistance of getting started on projects you've been meaning and wanting to do will magically fade away. You'll find yourself saying, oh, I already did that last time. I can just do this new project that... oh, say automatically checks the email in a particular email account every 10 minutes, and emails you that something has arrived. Just pass on the info at first. Just make the diode or router equivalent of a repeater or amplifier. Data-in, same data-out. Channels may vary. It's the essential trick of network address translation (NAT) in routers, and their optimization routing tables. Something on-the-left is equivalent to something on-the-right, and when something comes along either way, just shuttle it along. That'll be my first project. Capture ideas when you can, but when you can't, don't sweat it. Say them out loud in your head, and associate them with a few things around you, like where you are and what you're hearing. These things will serve as triggers in the future and induce you to have the same (or similar) idea again. Commit now. You may not get a chance for awhile. -------------------------------------------------------------------------------- ## Fri Jun 24 07:03:01 EDT 2016 ### Life and Robots and Adi's Future Robot Tech Project Everything is a system, and every system has its own prevailing architectural concepts, which is to say, it's biased towards doing a certain thing a certain way, based on preconceived notions of how the system should work. You and I as human beings (I presume it is mostly human beings who will ever read this) are ourselves systems. Or more precisely, we are the output of the DNA system of life that has evolved on this planet, which makes lumps of organized stuff that extracts nutrients and energy from its environment in order to go about its business, which is primarily to organize more stuff to be much like itself, but with sufficient permutations introduced into the system to challenge and prepare for unpredictable change, therefore instilling resiliency into the greater continuous system. 
We as individuals are temporary, but as far as our experience tells us, the overarching system of life and procreation goes on forever, surviving extinction event after extinction event. Of course, the ***precise*** details of what survives through an extinction event are somewhat up for grabs, until we get some Noah's Ark plan together so that humanity can hop over such terrible incidents and come out the other side still human, and not set back by ages. Step #1: Do it with robots. That'll be one of my ongoing projects that all my work transitions into; robots that can:

- Go dormant for long periods of time
- Draw energy from their environment
- Re-awaken, based on time or other environmental variables
- Have a high degree of mobility -- potentially globe-trotting and ocean-dwelling
- Scavenge their environment for what they need to self-repair
- Be set on the path to turn the self-repair ability into reproduction
- House large amounts of data
- Provide a variety of human/machine interfaces to make that data usable
- Allow rudimentary interaction with humans of any language (sign-language, video images, etc.)
- Do housework

The list probably goes on, but you get the idea. I don't want a Skynet here. I want a FidoNet. There was a FidoNet. I loved it. Reproduce that, without phone lines but with robots. The low-tech pre-Internet internet.

--------------------------------------------------------------------------------

## Thu Jun 23 20:20:50 EDT 2016

### Everything Worth Doing Takes a Little Bit of Work

My work at the office today was interesting on two fronts. First, I am using Pandas for things I would totally have used SQL for before. Now, I know what to do every time I sort of wish some data I'm sitting on were in SQL, so I could apply join-solutions and such to it, even though it lives in a spreadsheet or text file. Wow, it's just starting to dawn on me what that means... I should go seek out some data. And what better than this very journal? But tonight, I want to do stuff on the home-front. Things like dishes and laundry... very exciting.

Thu Jun 23 21:08:12 EDT 2016

Just talked with Adi. A lot about Rick and Morty, and why I watched ahead, and how Rick's "smart" could be detected, and how Morty acted as a shield, and how there was that world with a Rick that used a shield of Mortys, and how that was not a very nice Rick, and how I watched ahead to make sure I knew what was coming up, so that I could explain things quickly and efficiently and make sure that anything that needed to be talked about was talked about. And then we talked about dimensional scissors and subtle knives and donut-shaped universes. It was all very stimulating. She certainly is my daughter. I almost can't believe what I was able to pull off today, getting Miniconda and Jupyter Notebook installed on my SolidRun CuBox i4Pro. I also got Dropbox to run -- I forgot to mention that in my earlier journal entry, but that worked too. It's really amazing what that Exagear product made possible, and precisely how it all came together. There is a whole virtual machine running inside the tiny little ARM PC that I brought into the office, precisely for the purpose of getting the daily reports off of running on my laptop on a... drum-roll please... virtual machine. So, why is it okay on my CuBox, but not on my laptop? The difference between a laptop and a server, doofenshmirtz!
The tasks on a low-power, few-parts, solid-state, Linux-based box will run regular and solid, while those on high-power hardware with lots of parts, many of them moving -- and all of it Windows -- will be sure to fail aplenty. But to be running that x86 emulator on an ARM platform well enough to get Dropbox and Miniconda running adequately... amazing! So, I'm going to run out of steam again tonight. I can feel it setting in. And I have a lot of driving to do tomorrow night. Home is going to be the sacrifice, yet again. Sighhhh. Okay, I need a stronger will, it's as simple as that. This right now is one of my weak-willed moments. I need to decompress and create indexes, and optimize my components to work well again tomorrow. But for most of tomorrow, I'll be at work, and for the rest, I'll either be on the subway or driving. Don't over-complicate too much. But everything worth doing takes a little bit of work.

--------------------------------------------------------------------------------

## Thu Jun 23 19:55:35 EDT 2016

### Of Pandas, MinMaxScaler from sklearn, and matplotlib

At some point, I'll do a screencast of my process: checking in a shell whether vim is running with index.html; if it is, I immediately do a git pull, and if it pulls and merges cleanly, I type exit and drop back into vim, which notifies me that the file has changed and asks if I would like to load it; I answer yes, then hit @j to start making a new journal entry. The next thing I really need to do is start using matplotlib in Jupyter Notebook. I even have to just remember that package's name. I need a good example. Maybe watch those YouTube videos. Oh, and don't forget to mention how awesome it was to...

Gotta always remember: source activate journal

...how awesome it was to do today's work, learning Pandas and:

```python
from sklearn.preprocessing import MinMaxScaler

# df is an existing DataFrame with 'rangeofnumbers' and 'othercolumn' columns.
# Scale 'rangeofnumbers' to 0-1, then to a 0-100 integer column for sorting.
std_scale = MinMaxScaler().fit(df[['rangeofnumbers']])
df['onetohundred'] = std_scale.transform(df[['rangeofnumbers']])
df.onetohundred = (df.onetohundred * 100).astype(int)
df = df.sort_values(['onetohundred', 'othercolumn'], ascending=[False, False])
df
```

I also learned how to convert native Python objects to Pandas DataFrames, and how to sort and filter those columns. The above example was to prepare a column to be used for 2-column hierarchical sorting, which is very related to pivot tables or aggregate functions. All these things take all these different names, but are very much the same thing -- just expressed in different ways, like file systems, SQL databases, and name/value DBs of varying BASE-to-ACID properties. Spark, Hadoop, whooptie-doo. Just learn to think in varying ways about similar underlying relationships. Yeah, yeah, everything's really just a linked list.

--------------------------------------------------------------------------------

## Thu Jun 23 09:26:34 EDT 2016

### CuBox, Take 2! And a Real Jupyter Notebook / Pandas Project

Ahh, feels good to merge without conflict. Okay, yesterday was only so-so productive, but with lots of learnings. I'm finally trying to put my CuBox into use, but certain things in the report-generating process are x86-dependent, and I purchased (for ~$35 yesterday) Exagear, which people are apparently using to get Raspberry Pis to run Dropbox -- a worthy goal. Sure, Exagear is proprietary, but these tweaked-out, optimized VMs often are. I use (and love) VMware Fusion, so why not support a developer doing such a fringe thing as getting x86 emulation on ARM processors?
They port the hardware, so Dropbox and others don't have to port their software. I believe it's based on WINE. This falls under the category of dropping some money to keep continuity and power firing on all cylinders. You're flying high with these reliably running reports, and I don't want to miss a beat. Today has to be all about the menu project, and not merely the data behind the menus, but the construction of the menus themselves. The better the data behind the menus, the better the menus will be. I have to keep a lot of plates spinning, and make-happy a lot of stakeholders. Primarily, my boss, but now also two separate outside-department stakeholders. Look at the data closely that you pulled and correlated yesterday. I'm re-setting-up the CuBox. Dropbox is the main thing that needs Exagear, and I'm going to pursue that again, because having Dropbox on the CuBox is invaluable. But I also may look at installing Anaconda (minconda) on the CuBox to deal with a lot of the same problems as Windows and OS X have with configuration and building software. Admittedly, the builds may take all day on the CuBox. So before I do any of that stuff, it will be important to get the thing running headlessly in the background. What's your best approach for this morning? Get to the Menu work. Thu Jun 23 11:37:43 EDT 2016 Ended up working through the bounce project with the boss. Think I nailed it with his help. I'll get better at this mashup stuff. Lesson learned: look carefully at your sort-order of data-pulls when the data-pull is a subset of all the available data! Otherwise, you can get a crappy sample. Okay... get a little more on this pesky alternative server thing underway again. I know I want dropbox, so take care of exagear first. First things first, stop Debian from requiring 2 clicks on the keyboard to get a tilde symbol. This is apparently due to the "dead keys" feature of the keyboard selection. Just change the keyboard selection to a regular English (US) layout, under System / Preferences / Keyboard. Okay, also remove the dead key version, just to be sure... done and fixed. Next! Load Iceweasel and pull up my simplenote. Find the email Exagear sent me. Download the exagear-desktop-armv7.tar.gz tarball. Wow, that's big. Okay, it contains a bunch of virtual environments -- I get it. Thu Jun 23 16:13:27 EDT 2016 Wow, very intensely involved in the Pandas bounce work again. Totally haven't looked at the server stuff in hours. But I did... - sudo apt-get update - sudo apt-get upgrade - ...on both native ARM and inside the Exagear vm While you have a half-moment to spare, get Dropbox installed under Exagear. Thu Jun 23 17:49:21 EDT 2016 Okay, actually just spent about an hour with one of the property folks. That's good, but definitely keep perspective on forward-progress issues. My final thing before I head out today is going to be to get the MiniConda install going under Exagear on my CuBox. Okay, that went way too easily. Now, try installing Jupyter Notebook! Had to exit and reload the bash shell to get the paths correct. Okay, it's got a huge list of stuff to pull down and install, and I gave it the go-ahead. I'll wait through the downloading (I'm expecting, the fastest part), and then I'll go home during the software build. That'll probably take wayyyyy too long. And frankly, I'll be lucky if this works at all. Okay, it's now extracting the packages. Yep, slowwww. But I'm going to watch for a little bit to make sure no errors occur. 
I'll commit-and-push, and whether this even works at all will be something I won't document here until tomorrow morning.

Thu Jun 23 18:13:36 EDT 2016

I take that all back. Jupyter Notebook installed on the CuBox, and is running right now in front of me in Iceweasel!

--------------------------------------------------------------------------------

## Wed Jun 22 21:17:33 EDT 2016

### You Need Pandas to Understand Panda... They Did Appear Together

I believe this will be another night that I urgently should use to clean up at home and get myself organized in the real world, which is becoming so truly urgent, but still... but still... my earning capacity... my still relatively new job. My proving myself, and my segueing from SEO to Data Science, being so urgently felt in my heart. The last twenty years of my life could be spent as something of a dinosaur, following in the footprints of my father, who stayed in a drying-up industry until it was too late -- or not. Don't be as risk-averse as your dear ol' Dad. Take some leaps. Leap #1: those STILL in the SEO field after all these years of the writing being on the wall are preparing for some sort of transition now, for better or for worse. Many are probably hoping for better but expecting the worst. Google doesn't have to keep playing by its old rules. All it has to do is keep its users happy with their Google-experiences, and coming back to Google platforms -- even if those Google platforms are just sandbox'd apps within other platforms. Sure, platforms are nested. Systems are nested. Everything is nested. It's virtual machines all the way up and all the way down. Don't be surprised, nor pass any judgement about which is the "Real", because in reality, you, me and anyone else inside the system really have no f'ing idea. And it's thoughts like this that drive me on... onto Data Science. I'm in the honeymoon phase, where I'm still capitalizing it. But then again, I still capitalize SEO after all these years. Those of us in this sordid lot known as SEO's who haven't jumped-ship for Social Media or for SEM/PPC (paid search) before that or for Programmatic more recently are the die hards or last to jump ship as all the old tricks of creating buoyancy for garbage stop working. Now, your ship has to be seaworthy to stay afloat on the ocean, with that surface representing all that is visible under the eye of Google, versus all that is hidden in the deep, blue sea. Seaworthiness is quality, reliable stalwarts of providing visitor happiness. Google endeavors to glean your happiness subsequent to searching. That will get fed back into the Machine Learning algorithm, of course. Oh, to be Google. But alas, we are just us. But little ol' us can ask the correct big questions. And we can often come up with some very good insights that make all the difference. The universe is infinite, and our choices are infinite. Data Science helps us narrow down our choices to more reasonable infinite sub-sets -- setting some boundaries or parameters, such as it were. It helps us deal with precise sets of numbers as if they were mold-able materials without losing our sanity. It helps us do what we'd otherwise reach for a loop to do every time -- and it shows precisely why alternative APIs exist: the familiar constructs, as flexible as they are for general procedural programming, aren't the best way to express data transformations against whole sets. Pandas does just that -- as, I suppose, NumPy did before that. (A quick sketch of what I mean is below.)
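A minimal sketch of that loops-versus-set-transformations point (the column names are made up purely for illustration):

    import pandas as pd

    df = pd.DataFrame({"clicks": [12, 0, 7, 31], "impressions": [100, 80, 50, 400]})

    # The procedural reflex: loop row by row, accumulate a result list.
    ctr_loop = []
    for _, row in df.iterrows():
        ctr_loop.append(row["clicks"] / row["impressions"])

    # The set-oriented way: one vectorized expression over whole columns.
    df["ctr"] = df["clicks"] / df["impressions"]

    print(ctr_loop)
    print(df)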
-------------------------------------------------------------------------------- ## Wed Jun 22 09:59:29 EDT 2016 ### CuBox Not an x86 Replacement Take a stab at getting your CuBox set up. That set up at work will be really cool! 1, 2, 3... 1? Make sure you have the sdcard, card-size adapter and computer with an SD slot. Check! Woot! Okay, step 2 already... find the CuBox i4Pro download page... https://www.solid-run.com/downloads/ Wow, okay. Real hardware may be getting as easy as the cloud. Also, make sure you look at http://docopt.org. There is a standard for help screens in command-line programs. Wed Jun 22 11:20:58 EDT 2016 Okay, actually showing Paige the process of setting up the CuBox install. Wed Jun 22 16:02:43 EDT 2016 Ugh! Dropbox not compatible with ARM processors! Bought and downloaded Exagear. It's a bit overkill, but better to throw money at same-code than refactor new code. Dropbox on the CuBox will be infinitely useful, anyway. Wed Jun 22 16:40:33 EDT 2016 It's a bit maddening, but things aren't running under native ARM Debian. In particular, psycopg2 can't connect to the corporate database through the JDBC connection (surprise, surprise!). I hate that processor architectures other than x86 is as nasty a divergence as being on Windows. And I was essentially forced to spend $35 today to make ARM hardware look like x86 hardware. It's a virtual x86 machine inside the CuBox... actually, pretty impressive. Not thrilled it was necessary, and that Dropbox predicated it. But it will make things more controlled conditions, assuming that the x86 virtual machine is fast. And so, it's nested virtual machines... first exagear, then virtualenv. Funny! Hope I get it working today. Don't want to think about it again. But I get the feeling that I'm running a completely parallel and same-looking copy of a Debian 8 system within another Debian 8 system; the outer one being ARM and the inner one being x86. This is the sort of crap I was hoping I would not have to do anymore. All this to GET AWAY FROM virtual machines on my laptop! Ugh. Think about x86 processor architecture in your future microservers. For reports, don't forget: - pip install httplib2 - pip install google-api-python-client - apt-get install libpq-dev python-dev - pip install psycopg2 Wed Jun 22 18:30:27 EDT 2016 Ugh. Still hitting problems. Anything having to do with running layers of VMs is sub-optimal, and having been cornered into simulating x86 on quad-core ARM hardware in order to get Dropbox working was just too much today. And some of the steps in generating the report take soooooo long, as to almost be not even worth it. The CuBox is probably good for a bunch of other stuff, but for generating these reports, maybe not. I need something a bit more beefy and x86-like. Hmmm. Maybe bring in some different x86-like hardware from home. -------------------------------------------------------------------------------- ## Wed Jun 22 09:46:56 EDT 2016 ### The Observation Game and Invention Ah, right where I left off. Had dinner with Adi last night. Went to Ichi Ban. Feels like I wrote this already. Un-pushed edits from home? Yeah, I think so. Merge in my future. Beginning gradually to think-in-git. And I'm really just coordinating with my own work on the same file from other locations. No big. Entries are entries. If any get out of order, my cool new tech will work it out. I'm even now seeing applications of Pandas on this journal. That's going to be interesting, as I transition myself from SEO to Data Science. 
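One small, speculative example of "applications of Pandas on this journal" -- assuming the journal stays one flat text file (index.html) with entries split by the long dashed separators; the regexes here are just a guess at that structure:

    import re
    import pandas as pd

    # The whole journal is one file; entries split on the long dashed separators (assumption).
    text = open("index.html", encoding="utf-8").read()
    entries = [e.strip() for e in re.split(r"-{40,}", text) if e.strip()]

    # Grab whatever follows each entry's "## " date marker, if present.
    dates = [re.search(r"## ([^#]+)", e) for e in entries]
    df = pd.DataFrame({
        "date_heading": [m.group(1).strip() if m else None for m in dates],
        "chars": [len(e) for e in entries],
    })

    print(df.head())
    print(df.chars.describe())  # e.g., how long a typical entry runs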
It's becoming a pretty clear-cut move on my part. I'll never want to be an actual software developer, because I'm never going to want to code for anybody but myself... as in, I'm only going to want to be implementing ideas of my own design, for my own reasons, and for my own inventions. I love having inventions. It's really starting to dawn on me that with Github, the primary thing that's really tweaking me out about my work is that I'm an inventor. And I'm perched at that point where I can really start to pump out quality inventions and media surrounding their acquisition, improvement, and use. That's me. That simple. Loooove that feeling. Don't know why I couldn't make myself a Thomas Edison from early years. I'd have loved that shit. Sigh... anyway, no regrets. I love who I am and what I am right now, and I'm very important to one particular little girl, who isn't going to be able to be Daddy's little girl forever. I want to equip her for the world I see coming. It's not going to be all Terminator John Connor, but it will be somewhere on the scale between the normal way to bring our children up, and tikkun olam required. Let's call it casual altruism. If you can improve the world somehow, while you make yourself rich (and OPTIONALLY famous), then do it. Funny thing about the Internet, fame often comes before riches, and does not always result in them. So, be careful. Invent and protect yourself, based on the realities you see and believe to be true, through the repeated and as-unbiased-as-possible observation game. Don't believe what you're told. Believe what you see. And verify that a couple different ways as well. Look for the cracks in things. If you're not looking, others are, and the rewards for finding those little cracks in the way things work should be yours, and not some other randomly observant person's.

--------------------------------------------------------------------------------

## Tue Jun 21 20:46:23 EDT 2016

### Courier New, The One True Non-Proportional Font

Wow, working in Courier New feels just so much more grown up than Lucida and those other terminal-friendly non-proportional fonts. These are known as monospaced (fixed-width) fonts, and even the lower-case i is as wide as the thickest upper-case W. Crazy, no? But way-cool for tabular data. Had dinner with Adi and Rachel at Ichi Ban. Wow, is that place delicious. My body has some protein to digest of a diverse array of creatures. Mmmmm. Carnivores. Well, I know we're omnivores, but it sounds less intimidating. Okay, tonight? Sleep early. Maybe test my system. Tempted to document it. Got out a quick email to the boss on an important question. Kept my word. Always very important in these situations, and I did it. Self-high-five. Next? Oh, hmmm. Chill out and relax. You earned it. Wow, you're on the verge of doing your first real Pandas project.

--------------------------------------------------------------------------------

## Tue Jun 21 09:38:42 EDT 2016

### First Pandas Project Now Underway

Didn't get in as early as I wanted. I should be walking in here a bit before 9:00, and I should be leaving home a bit before 8:00 and I should be going to sleep a bit before 10:00. That would all be very lovely. But I have to guarantee I can "get into the zone" on both the home front, and the work front occasionally. Over the weekends, I'm "in the zone" with and for Adi... EVERY WEEKEND, no compromise. I did some really outstanding work on Thursday, Friday (which turned out to be work-from-home and vacation days) and Monday.
Yesterday, I knit it all together into an actual solution. I need to make GoodSheet (rename it HappySheet?) into an easily reproducible experience -- much more so than Pipulate ever was. But Pipulate becomes the function repository. I'm creating just something of a batch-processing convention. Could it work *in conjunction* with Pandas? I'm reading about numpy ufuncs, and it appears they operate element-wise on scalar values. See if you can start getting both Paige and Marat into Jupyter Notebook. Just establish its name as Jupyter Notebook with them, and try to "edit" IPython out of the discussion. Ugh! All the internals of the program still use IPython, even the file extension ipynb. D'Ohhhh. Okay... hmmmm. Plan your next steps. I should really hang out on the 11th floor more, and make myself more accessible. But ironically, I'm on the "quiet" floor where I can do some real development work, hahaha! So, think through today. Your biggest immediate thing you need to confirm is psycopg2 under jupyter notebook. 1, 2, 3... 1? Clearly it's seeing whether or not that's available under conda. Yup.

    conda install -c anaconda psycopg2=2.6.1

But before I do that, I want to establish the before and after. Get a script onto my machine that should run.

Tue Jun 21 10:38:09 EDT 2016

Just went to the Menu meeting, and then met with a Production guy about some title tag override questions in news (was it overridden?). Okay... think! Barebones SQL stuff exists. It's in my virtual machine that I almost never use. Hmmmm. Get that new hardware running! Kill a few birds with one stone today. 1, 2, 3... 1: Before & after with psycopg2. Before... hmmm, barebones.py I believe is in my reports repo. Okay, done. Now get a new .ipynb file in the /core repo that does EXACTLY the same thing as it does in the serpchiver folder.

--------------------------------------------------------------------------------

## Mon Jun 20 16:51:17 EDT 2016

### Putting Latest Work into Proper Repos

It's funny the difference a particular font choice makes. And Courier New is really growing on me in this high resolution usage. It's one of those fonts that looks better with higher resolution. Bold looks really terrible, though. This is a font whose wire-thin design really works for it in high res and high contrast. Okay, enough about fonts. This is my first real breather in a while. I want to "can" all my good work from over the weekend, and now also from this morning having to do with running batch jobs. Look over my work carefully, and decide what needs to be promoted to where.

--------------------------------------------------------------------------------

## Mon Jun 20 09:08:05 EDT 2016

### Now, I've Got a Machine Gun. Ho Ho Ho.

Keep focus and stay sharp. Don't get flustered. Work your way through this smooth. The Wolf from Pulp Fiction. Be that character. Go! Okay, I flattened the source list. I also ran my battery of tests. Now, make a copy of serpchiver. Make sure the copy runs well, too. Check! You are in the home stretch. Your Pipulate functions should migrate smoothly. Okay, this is going to test your kung fu skills. Let's focus and accomplish. Establish that you can read the source data... FAST!

1. Rename chunkulator to batchit and test against real sheet.
2. Fix last range of chunkulator to do the correct range.
3. Move the linklist function over.
4. Restrict the chunkulator to process only one row.
5. Make each...

Lots of clever, fun, done. Amazing! Now, I've got a machine gun. Ho ho ho.
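For that before-and-after psycopg2 check from the Tuesday morning note above, a minimal sketch (the connection parameters are placeholders, not the real corporate ones):

    # Run this in a notebook cell before and after `conda install -c anaconda psycopg2=2.6.1`
    try:
        import psycopg2
        print("psycopg2 is importable:", psycopg2.__version__)
        # conn = psycopg2.connect(host="db.example.com", dbname="reports",
        #                         user="me", password="...")  # placeholder credentials
    except ImportError as err:
        print("not there yet:", err)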
--------------------------------------------------------------------------------

## Sun, Jun 19, 2016 11:41:52 PM

### Brand New Kung Fu

Just got home. Interesting. I love how this looks on Windows 10. I should really get Anaconda running here. This is my first real time since Friday, when I picked up Adi, that I'm "free" again. I love being with Adi, but being with Adi is being there, in the moment, heart-and-soul, no time for getting into the zone with something else. Adi is the center of the zone. She is the source and destination of the flow when I am with her. Everything else is competition. Everything else is keeping my job and performing up to snuff. Not easy to slip in inspired work requiring that zone-place between forced later bedtimes and the slivers of time after waking up. Nope, that was the way of the past. Brave new day. Get some sleep. Wake up early, leave early. Use your new tech. Do that Kung Fu you can now do. Jupyter Notebook is pretty bad-ass kung fu. Okay, re-state for your own edification the amazing thing you did on Friday. And don't let it be lost on you, Adi was crying as you were driving away. But Adi, this is all for you, and it's not a big-build thing anymore like Pipulate. It's the fewest lines of code that can do some awesome things, and be highly readable, and not OVERLY deconstructed in a way only a DRY person could love. Instead, go for obvious and powerful and elegant in its very obviousness. Lay THAT as your foundation system. That's GoodSheet.

--------------------------------------------------------------------------------

## Sun, Jun 19, 2016 4:27:54 PM

### Adi Has Been A Little Sad

Still in the Catskills, still on the Spring Glenn side. Just paid ahead by a cycle. Make things a little more lovey dovey between the lawyer board member and himself. Just as general advice, don't make ultimatums, and if you do make ultimatums, make sure you're not so weak that, if conditions change so that the ultimatum no longer applies, you don't "get it" and say, hey, let me talk to the board and see if we can't just lift this extra little bit of petty punitive knife-twisting now that we know you're going to sell. But coincidences happen and changes of heart happen, and so now I consider myself as having a full range of options and think I might have a little fun with it still. Because no matter how bad it's going, if you can't have a little fun with it, then why bother at all. Just make sure that if the one you're having fun with turns out to be a little bit more or a little bit different than you were expecting, that you're smart enough to recognize it early and adapt. Or else, you're going to end up looking, at the very least, silly, and at worst perhaps rather ridiculous, for letting someone so easily play a professional player. Yawn! Just don't have time for that shit. But I do love to write and capture ideas. And this was one crazy Inwood gathering at the Catskills this weekend, amongst the staples of the neighborhood, Jason Minter and Manny... uh, need to know his last name. But it felt like a who's who of Inwood staples up here. Going to play Lego with Adi... I'm going to make sure her decisions factor a lot into our decisions. But God, how I would love to get rid of the blood-letting monthly expenses. There's only so much in me. Yes, there may be some sadness, Adi. But let's shape this experience in our mind. What would be going on if it were an Inside Out-world move -- our lives, that is? Sandburg creek... ahhhhh. Inwood crew.
Jules and Clem hanging out with Adi at the same time, along with Simon and a few of the gang from the attempt at the home-school co-op. -------------------------------------------------------------------------------- ## Sun, Jun 19, 2016 12:09:22 PM ### Climbed Up On Roof to Rake Off Pine Needles When I go back into the office, it should be wielding my latest secret weapon. I would really love to take some time today to work at it. And I don't believe I'll be, as I heard it put once, "ignoring Adi" trying to get some personal/professional work done during the precious weekend time, which is some of the only time during the week I get to spend with Adi. Well, it's not true. She's interacting with the real world when her nose isn't in the iPad. And at the current moment, I hear Adi saying to Simon: "Maybe we should interact with the real world...", and they turn back to the iPad. And Adi explains wibwab's to Simon, but also that it requires Internet, and that we don't have very good Internet right now. Adi's saying maybe he can start at the first level, and that he might not believe it, but she defeated the big boss in wibwab. Adi's asking if she can have it after he defeats... something or other. I sense my exclusive access to my SP4 being in jeopardy. Find a place to casually retire to not being notice for awhile. Let them relate to each other. I still have to clean the pine needles off my roof, and it's all set up. Now is the time, before it gets too blazing hot. -------------------------------------------------------------------------------- ## Sun, Jun 19, 2016 12:04:02 PM ### George and Harold, Simon and Adi. @Catskills Courier New Regular, 16 point on a Microsoft Surface Pro is a nice font to look at. It reminds you just enough that you're human, and your connection to the human world of distinctions being necessary, but the vast majority of information being driven into the subconscious handling mechanisms of the brain or the mind or the meat computer, or whatever other silly distinctions or labels you care to use. These are the times, and this is the place. I hope I push out a bunch of my recent videos. Here comes Simon and Adi. Coding, coding, coding. Let them see me code. No shame in seeing how fast I go on a keyboard, whirrrrrrrrr, look at him time! But they're here for the Legos and iPad in the cool. It's heating up out there. We have a pool visit in our near future, but I'm going to have to get her out to Walmart for a bathing suit. -------------------------------------------------------------------------------- ## Sun Jun 19 09:54:04 EDT 2016 ### Live From Granny and Pop Pops! I have a journal entry on my Microsoft Surface Pro journal repo clone directory, but that computer has the charger here at the Catskills during the weekend, so Adi is monopolizing it with her morning fix of Teen Titans Go! I'm so glad she's getting indoctrinated into the DC Universe super hero culture. It will be interesting to see if she becomes a DC kid, a Marvel kid, or absolutely no allegiance, but with distinct understandings of the separations between universes, both in reality (ours) and in their story-lines (occasional cross-overs). I need to update this machine's .vimrc and spell dictionary to use the Dropbox ones. I can "feel" the divergent environment. Funny how I'm knitting together my own timeless (forward-in-time) mostly text-based environment, made even more durable by leveraging the browser (Do you say "Web browser" anymore? And if you do is it capital W?). 
Well, the funny thing is, with a merge coming up, and starting to get a real appreciation of what happens in merged-but-conflict-resolution-needed files, you can proceed without git-knowledge-deficit fear. Just remember, git is made by the same guy who liberated *nix OSes for the masses. Well, BSD almost did that first, but then came the intellectual property battles: the ridiculous companies that recognized the value of Unix, but did not recognize its inherently wants-to-be-free-and-open-source nature, went to war to keep all versions of Unix paying homage to them... literally paying licensing fees for their use of Unix-like stuff, or being liable for violations that would incur lawsuits -- a generally anti-innovation, stifling environment. Google SCO Unix and learn all about that nonsense. The SCO wart on technology's history and war on innovation turned out to be not necessary at all, as the crew behind BSD got busily to work black-box reproducing (permissible under IP-law) all the parts of BSD Unix that were under suspicion of stolen-code violations. Soon, there was FreeBSD, a version of BSD Unix that simply worked around the bits that SCO was crying about like a baby with its blanket being taken away. So, it was just an embarrassment when the whole lawsuit was rendered silly nonsense on TWO fronts. Not only did BSD become truly free and open source after all, but a certain Finn, looking for a project, started hacking up his own black-boxed alternative to what, as far as he knew, was IP-burdened stuff -- humbly named by a certain git as Linux. And so continued a tradition born of Ken and Dennis hacking up a giant bird-flip to Multics, an OS only corporate overlords could love, in the form of Unix, an OS only tweeky nerds destined to be computer "scientists" -- who might even someday own their own (non-timesharing) computer -- could love. Steve Jobs loved it, and so now OS X is actually a prettied-up version of Unix. And that Finn, named Linus Torvalds, loved it too; he black-boxed a good chunk of the thing, so that this other thing that reverse-engineered another good chunk of it (the Unix command clones known as GNU, from Richard Stallman) could unify with his kernel to become what should probably be known as GNU/Linux, but which most people really call Linux. The popularly known Linuxes also add a desktop component, usually Unity or GNOME, combined with GNU/Linux to form some sort of Mac/Windows wannabe clone. Commit before battery dies.

--------------------------------------------------------------------------------

## Sat, Jun 18, 2016 10:17:10 AM

### Back to the SP4

Now, I'm on the Surface Pro, at the grandparents' house in the Catskills. Think things through. Okay, tmux is working, but I'm still favoring full-screen. I've got to learn to be comfortable doing my work here on the SP4, because even though the Mac laptops in my life are really the best platforms for the video stuff (ScreenFlow is wayyyy better than TechSmith Camtasia Studio), this computer is really very nice too. There's all the drawing I can do on it (touchscreen) if I keep the really nifty pen in the same place as the computer more often. Also, even without virtual screens (which Windows 10 does have) the 3-finger swoosh on the touchpad switches between full-screen-(ish) apps super-well. Very smooth and non-disruptive. Interesting contrast to the animated virtual screen ribbon effect on Mac OS X. I'm still looking for my favorite font. I'm going to try Courier New for a little while. Looks so sharp on this screen.
Adi's asking for the other Catskills house, so save, commit and push!

Sat, Jun 18, 2016 6:53:20 PM

Funny how I don't always want to start a new journal entry. But Manny's here! And this is a continuation of the day. It's an amazing day, really. Going from Granny and Pop Pop land to this strange development, with Manny up to visit Jason. And Jewels and Clem and Adi all hanging out. In addition to Simon and the whole colony here. Wow. These are those memory days. And I may never get my Note 5 phone back from Adi and Clemi playing Pandas games. The third Pandas in my life, haha. I should really screen shot that as the picture to go with an article that could go Medium.com viral with me redefining old school SEO as Data Science. The Panda alg tweak of Google meets the Pandas of awesome data science giving R a run for its money (free, in both cases), so a battle of ideals. That's pandas in all their forms -- a battle of ideas. And it's not all black and white. BAM! Oh, and that S-something burg creek today. What a treat. Real memories being made. Glad I got pictures. Phones, wow! A supercomputer in my pocket. I will benefit greatly. Enjoy the information that's at your disposal, and is yours to record. Play back locally, over and over, transferred from old medium to new as platforms and core assumptions about privacy change. The mere users of products will be reliant on the cloud products of clever integrated services of big providers, like Apple, Microsoft and Google. Our world reads like a really classic comic book. I mean, come on. It's old skool home computer hardware people gone cosmopolitan unix versus the folks who've got the latest round of computer-service-buying companies by the gonads, because their whole corporate infrastructure is based on your proprietary technology, and that makes us still infinitely strong, Microsoft. Versus the good old algorithm weaving boys at good ol' Stanford who figured out that the ebb and flow of general inquiry activity on the Internet could be monopolized by doing a better job than anyone else, adapting to change, getting whatever hardware and cloud experience you really needed, then working out your robot army to rule the world by 2020. They'll just refuse to drive you anywhere.

--------------------------------------------------------------------------------

## Fri Jun 17 16:24:08 EDT 2016

### Test Suite

Okay, it's 4:30, and that's just about time's-up given my Friday schedule. Still haven't done one whit around the house, but have made tremendous progress on the Pipulate 3 front... wow! How easy things can be when you just let yourself use the big tools, and not have those artificial restrictions of economy that I was putting on myself, because of... well, old skool thinking. It's always a good time to blend in new thinking, because things really do improve. Git is a new(ish) innovation, for example, and the Anaconda, Jupyter Notebook combo appears to solve the "things not working right for everybody" problem. Figures some folks would tackle this on a magnitude that humbles my work by comparison. Interestingly, I'm still way better on the visual front, and they just adopted an extremely vanilla (appropriate, I guess) look, which I believe is a default QT interface look. So, I have to hit this thing home. Be sitting on top of a bullet-proof machine gun when you get into work. You need the concept of a dbcache.

Fri Jun 17 17:54:05 EDT 2016

I didn't do the shelve stuff, but wow... the test suite frameworky thing I did just now is amazing.
Go check out my goodsheet and serpchiver projects over at Github. They work together. It's not going to be clear to anyone out there really until I start making videos about this stuff, but this is a firm grabbing of the reins of some pretty powerful horses. Shit, I feel the secret-weapon-i-ness gradually coming back. Wow, how did I miss you. I use tools better than most, and I love me some timeless tools. Jupyter Notebook doesn't qualify as being as fundamental as Linux, Python, vim and git themselves, but shit, wow did some clever people put together one kick-ass pythonically webby hetero-runny REPL-y awesomeness, and I salute them. Now, a fast grab at shelves before the weekend begins. I will feel very good.

Fri Jun 17 19:23:53 EDT 2016

Okay, I got done what I wanted to get done. I had my success assured moment. I can split jobs across my desktop machine's capabilities for work that's small-scale and persistent but heavy-weight as far as Excel and GSheets are concerned, without really thinking about databases. It's just a filename in the repo that matches the tabname (check for spaces?) that is a native Python shelve object, bound to a very simple dictionary of lists that reflects all the prior cells on the row (a quick sketch of the idea appears a bit further below). Wow, that could be REALLY powerful. But for my immediate purposes, it's for archiving raw serp results, and even the lists-of-links found on pages. All that stuff is too heavyweight for the spreadsheet programs when you get into the thousands of rows. That's an awful lot of field stuffing. But I just flow that on out to my hard drive in some surely optimized Python format, and keep it around for a little while to join back to data on its keys (think about what to use typically as the key) for faster-than-vlookups, and more accurate-than-copy/paste. BAM! Good job, Mike. Pipulate 3.0 is born. Make sure you can keep using that awesome logo! Work it in, and figure out the relationship between it and GoodSheet, and projects like serpchiver, and still even Levinux. Off to see the Adi! ...

--------------------------------------------------------------------------------

## Fri Jun 17 11:35:21 EDT 2016

### Cut Out Tedium & Waste

I have to fix things better and fundamentally on two fronts:

- Home
- Professional Secret Weapons

In for a penny, in for a pound. Now that I'm using Jupyter Notebook, it's probably time I started using ScraPY too, for my crawls. I could even use it in combination with Pandas. Both are optional, as is pulling in the original list of data to crawl against.

Fri Jun 17 12:26:12 EDT 2016

Just ate for the first time in a while. Think stuff through more than you do. Do stuff better more than you do. Do more than you usually do... Including sleep and rest. Huh, how? Efficiency. Innovation. Look for space provided-for in the plan, if only you could cut out some of the tedium and waste. Layer things and connect them properly, but be careful of your dependencies! Cold-blooded reptiles are experts at saving energy. Mammals are spending energy all the time, but compensate with greater intelligence, emotions, and constant activity to keep finding more food. I've been coding like a mammal. I need to start coding a little more like a reptile. I think it's time for ScraPY. Oh wait! Nope, it's Python 2.7 -- reason enough to not incur the extra dependency and learning. Just transpose the code from Pipulate to run the job you've been running a lot recently, but without the GData API's chatty http overhead. I was writing per-row, generally. That's better than Tiger, where I was writing per cell.
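Going back to the shelve-as-local-cache idea from the 19:23 note above -- a minimal sketch, assuming a tab named 'Sheet1' and made-up row data (the keying convention here is only an illustration):

    import shelve

    tabname = "Sheet1"  # a file in the repo folder, named after the worksheet tab

    # Write phase: stash heavy per-row results locally instead of stuffing GSheets fields.
    with shelve.open(tabname) as db:
        db["http://example.com/"] = {"serp_html": "<html>...</html>",
                                     "links": ["/a", "/b", "/c"]}

    # Later: join back to spreadsheet rows on the same key, faster than a VLOOKUP.
    with shelve.open(tabname) as db:
        row = db.get("http://example.com/")
        print(len(row["links"]) if row else "no cached data")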
It's time to go industrial, while still remaining cute and accessible. 1, 2, 3... 1? So, then the question is Pandas or not. Again, I'll say not, but I'll keep it all flexible. The less code you write, and better you separate bits from each other, but more readily you can make it work different ways. Okay, so this job is: 1. Read the data out of GSheets. 2. Write data back into GSheets. 3. Chunk it for fast processing. Fri Jun 17 16:03:11 EDT 2016 Wow, made fabulous progress. Really happy with this bit of chunking code. I think GDocs might remain viable as a place to process this sort of stuff, so long as I get/keep a really good feel for how much data (really) you can cram into GSheets. It's the best place to read lists from, even in the thousands, and the best place to write correlated lists back out to, similarly by the thousands. Maybe up to 5000, processing rows by batches of about 50, and keeping superfluous columns to a minimum. In fact, one of the biggest tenants I believe I will have here is using temporary local uniquely named ad hoc databases, initially tied to the Shelve API, but at some point able to be re-wired to different back-end databases through the compatible Shove API. That way, after doing some per-row data collection that's too big to stuff into GSheets, I just keep the database local on my machine, in that repo's folder for awhile, until I'm really sure I don't need it anymore, then I can throw it out. And in the meanwhile, it will be WAYYYYYY faster than zipped and bin64'd data stuffed into spreadsheet fields, cool as that is. Implement this plan. Here's the chunky logic. Can't believe how terse and elegant this all can be. def chunkulate(sheet='longneasy', table='Sheet1', rows=1000, stepby=50, secs=0): # Batch-copy incremented integer to next column. Use gs.copycolumn() to set up. gssdoc = gsconn.open(sheet) # Check for bad filename asheet = gssdoc.worksheet(table) # Check for bad tablename whole = range(rows) chunks = [(x+1, x+stepby) for x in list(whole) if x%stepby == 0] for atuple in chunks: arange = 'A%s:B%s' % atuple cells = asheet.range(arange) for cell in cells: col = cell.col row = cell.row if col == 2: if cell.value: cell.value = int(cell.value) + 1 else: cell.value = val val = cell.value print("Updating: %s" % arange) updatecells(credentials, cells, asheet) print("Done") -------------------------------------------------------------------------------- ## Fri Jun 17 09:21:45 EDT 2016 ### Namespaces are good. Sent out emails about taking today from home. Added a namespsace for goodsheet. Good to set the conventions out well from the outset. Just as with np for numpy and pd for pandas, I want to help encourage gs for goodsheet. It is also available globally, and does not need to be passed into functions anymore. It is a global connection resource for the system, and is one of those things that can/should be handled that way for good recycling. Okay, brought a few functions over to goodsheet repo from IPython. Discovered Markdown in Jupyter Notebook. 
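A tiny sketch of that import-alias convention (this assumes goodsheet is importable as a module; the commented call is a placeholder, not the real goodsheet API):

    import numpy as np      # the established aliases...
    import pandas as pd
    import goodsheet as gs  # ...and the convention I want to encourage for my own module

    # gs then acts as the one global connection resource -- no need to pass it into functions.
    # e.g. (placeholder name): gssdoc = gs.gsconn.open('longneasy')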
-------------------------------------------------------------------------------- ## Fri Jun 17 06:01:52 EDT 2016 ### Four Different "Modes" of Pipulating Okay, what I got was this: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, GData API connectivity issue 59, GData API connectivity issue 60, GData API connectivity issue 61, GData API connectivity issue 62, GData API connectivity issue ... 380, GData API connectivity issue ...and so on. It's actually still running, but it's the equivalent of "stormyweather" in Pipulate, and is something that's never recovered from on the same connection. When that happens I have to try to make a new gsconn object! I already experimented to see if gsconn can be closed, and it can't, so just try making it again. Stop the current run, lower the number of rows (so it's not running all day when you're at work), and do the next experiment. Fri Jun 17 07:41:37 EDT 2016 Okay, happier with the code. Broke out some functions. Fri Jun 17 08:31:40 EDT 2016 Ugh! That huge sheet I was processing yesterday in GSheets got bloated at only about 4000 rows, because of the field-stuffing I was doing from the linklist function and google serp archiving function. I had to delete all the serp archive data for sure, but I kept in the linklist function, which is a scrape of all the links from the site's on-domain links from their homepage. I had to flatten the sheet as a CSV export, bring it into Apple Numbers to paste in the missing data, and bring it back into GSheets flattened. And so now, I think I'm going to take today as that day to get some stuff done at home that I had hoped to use yesterday for. But it was all about parsing that list. I also have deeply internalized the issues of using my old system against large jobs, and started to internalize the awesomeness that is Pandas. It's time to get my approach to be more compatible with Pandas. I have several modes of operation to think about: - Impress'em - Small runs directly into GSheets. Archived pages, serps & extracts - KPI FYI's - Long-running jobs slowly accumulating int-like data in GSheets - DB Required - Long-running jobs accumulating big data, GSheet logs optional - Tail Eaters - Only recent data important. Tail eats itself, GSheet UI okay I've focused too much on the Impressum's with Pipulate, for in-person demos and ad hoc site crawls that do on-the-spot second-degree lookups, against FB likes and such. But it's time to accommodate pretty much any job that comes my way, with the details being wired-up however appropriate. Commit this, and bring those notes about different "modes" directly into the .ipynb code. -------------------------------------------------------------------------------- ## Thu Jun 16 19:32:48 EDT 2016 ### Long-running jobs maintaining OAuth2 login, take 1! Hmmm... okay. Let's see. Busy loop complete. Now, write out a number of times to a spreadsheet as a test. Forget the while loop. Go to a for of the same size as the sheet you create. Thu Jun 16 20:14:43 EDT 2016 And there it is, the tictoctask in my latest ipynb file in this journal repo. I'm actually getting proud of this. I can set up a task that connects to the Google spreadsheet only once every 10 minutes, and plops in a timestamp. Thu Jun 16 20:41:59 EDT 2016 I just had the most amazing talk with Adi about coding and programming. 
We went into Python and Google's Go Lang and concurrency and the concepts behind map/reduce (though I didn't call it that to her). And now... and now... wow. Just make a loop that does the thing it frustrates you that the current Pipulate can't do... stay on super-long with one task. Go a step at a time. Start with this, and just keep it running for as long as you can... again and again. Go for a record. One every 10 minutes.

Thu Jun 16 21:51:21 EDT 2016

Okay, I'm very happy with this state of affairs. I'll set it to finish by the time I'm waking up (~6:00 AM). That would be 480 rows. Go!

--------------------------------------------------------------------------------

## Thu Jun 16 18:01:02 EDT 2016

### Controlling Alien Tech Through Python API Wrappers

Okay, this will write a line into a Google Spreadsheet for every line in this journal. Wow, echoes of trying to do advanced stuff with XSLT 1.0... shit, Python is SO MUCH better for... well, everything. Sure you COULD parse one XML document on the left, apply a stylesheet, and output as many pages to the right as you like, on the xsl-document element, and worry about xalan vs saxon and most likely being tied to the Java world. But why bother, unless your problem really screams out XSLT? Instead, try Python 1st, 2nd and 3rd -- with 1st being Core packages, 2nd being 3rd party packages, and 3rd being finding some wrapper written in Python to bind a delightful Python API to something from some other world, like Lucene and Elastic Search. I'm pretty confident that Python could someday control alien technology, so long as they write us a Python wrapper. Okay, I have the first real baby-step of the meta journal tech occurring. I feel like it's a good time to commit, because of the state of the parsejournal.ipynb file. Well, let's see... hmmm. Next intelligent step is to just set tests going that make me WANT to get up and get away from my desk and do stuff. First, I should put in the GSpread command to blank the sheet I'm working on. We're starting right from the data collection in the first place, baby! Go ahead and integrate ScraPY crawls, or whatever you want. I thought of it here first, ladies and gentlemen. Now that my everyday platform is so durn transportable, wow... look out world, I've got me some power-tool Python packages to ponder putting to a purpose... from the Windows, Mac or Linux desktop or laptop system. External servers to host some code on, or to bounce some traffic off of, are optional... but will certainly be possible and encouraged. There's no better IP to use than the one that's not yours, and the shuffleroo should be achievable in either one of two ways. There's that slick, consumer-oriented anonymous VPN proxy software. You install software like HideMyAss, and all your web traffic can be made to look like it's coming from anywhere in the world. Or you program your app to funnel all of its out-bound http requests through a particular proxy. It's just a bit of re-routing of your out-bound requests to make their first stop through yet another Internet Gateway router, such as it is, so your personal IP is protected in the same way as your machine's IP when you surf out to the Internet through the company's network. Chop! Off comes your info, and on goes what looks to anyone sniffing as just another consumer end-point or small business. Encrypt your requests as https (a tiny sketch of the proxy-funnel idea follows below).
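A minimal sketch of funneling an app's outbound requests through a particular proxy (the proxy address below is a made-up, documentation-reserved placeholder):

    import requests

    # Placeholder proxy endpoint -- swap in a real anonymous web proxy or VPN gateway.
    proxies = {"http": "http://203.0.113.10:8080",
               "https": "http://203.0.113.10:8080"}

    # Every request is routed through the proxy; the target site sees the proxy's IP.
    r = requests.get("https://www.example.com/", proxies=proxies, timeout=30)
    print(r.status_code, len(r.text))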
Cycle through different IPs, either at the VPN software level, or by working a list of anonymous web proxies that you receive as text every night, for life, from HMA, for a very nominal fee. I can just go ahead and talk about and develop this stuff, mostly for my own personal use, of course. But it is indeed some Good Sheet. Domain squatting is done. When I'm ready, I'll do something with a clever domain. Next? What next? Hmmmmm, this is some really special, inspired time. Do something crazy-clever. Improve your life permanently and forever forward. Treat your desktop like a server running a scheduling service. But let that service actually BE your local Jupyter Notebook... hey pssst! Jupyter people, how about this: Take a smart scan-and-save snapshot of the Internet landscape, collecting your KPI's du jour, such as it were, in some highly personal, highly controlled, highly automated, yet highly visible, accessible, and indeed, grokable ([Esc]bzgi, continue typing). No reason for grokable to appear as a misspelled word in vim, so I just popped it into my spell-checker's okay-word dictionary, which I keep in Dropbox, and by extension, in Github. It will be interesting to watch it grow on github. I have to remember to commit and push the vim directory occasionally on any of my Dropbox machines. Vaaat a workflow! It's almost as if it were the Amiga days... ironically, on Apple. Pssst! Hey, Apple! Why don't you allow each app to decide whether it prefers to be opened in Full-Screen mode? I would LOVE this for the terminal. I use the terminal almost exclusively in full-screen mode now, and the Mac's implementation of full screen and "horizontally"-sliding virtual desktops all works together so elegantly, to give me just the right big fat fonts on my screen that my tired old aging eyes need. Ohhhh, tomorrow after work, I go meet Adi in the Catskills. I have to advertise the place for sale more. I need to figure out what I owe in co-op maintenance fees. Once I sell the Catskills place, I can focus on the apartment. I can't focus on the apartment more, because we're not here on weekends during the summer, and I'm not going to short-change Adi one whit of weekend time. Fucking lawyers. They should try coding something original. Go work on some open source law system that floating cities can use to modularly assemble their system of government, or some equally clever and worthwhile creative pastime. Don't even think of baiting me into legalese jive talk that's self-affirming to nobody but the lawyer. BAM! I love knowing what's important, and keeping everyone and everything in their proper sense of perspective. Vague enough? Clearly. Move on. Just move on. You do your thing. They'll do theirs. I love what I do, and I try to share that love with those around me... generally. And that's all that matters. NOW. This time. This gift. I have to improve my already really amazing processes. The concept of a config file goes away. All config lives in essentially "live" Jupyter Notebook code, stored on disk automatically and intermittently, as .ipynb files. This is nice, because there are no mental abstractions separating the numbers you're providing from what you're going to be doing with them. Everything is in context. Few things are really hidden, and those that are are just really a text-file load away. It starts with installing Anaconda to get Jupyter Notebook installed along with a whole bunch of wonderful Data Science tools. It's time to go all data science.
But just because you're using Jupyter Notebook doesn't mean you need to leap right into Pandas, if that's not really what you need right away. And I don't think it is with me. I need to run long-running jobs. That's job #1: Long-running jobs that don't stop because of oauth2 authentication issues. It just has to be able to keep running, reliably, and over sustained days -- continuously, if there's no reason why not. Essentially, I propose keeping a 24x7 command-line terminal window open, from which you run jupyter notebook (clever way to get people to switch -- making them type it to run it every time), and one Web Browser window, with the active-running Python script kept open in a tab all the time. I will have to learn the ramifications of such things as closing the last Jupyter Notebook tab. Clearly, you can't close the command-line, lest the task it is bound to stops. But you MAY be able to close the last Notebook tab, then just log back in by visiting http://localhost:8888/tree. I do believe that you can't visit a currently "open" file. Hey, I guess opening through a web browser, with some sort of file-lock system over a web-tool-utilizing browser system, is just as good as opening some native file with some native software that "opens" the file into the software. What's "opened", after all, but a copy of it being moved from long-term memory-storage to the closer area of immediate-experiences memory that's always floating around from that day's accumulation of experiences, until sleep can sort it all out? You need sleep to learn. You can deny yourself sleep here and there to push yourself hard, but you have to occasionally catch up with your body's sleep demands, or you'll go stupid. Sleep makes you smart. Get enough.

Thu Jun 16 19:20:26 EDT 2016

And this looks like a good time for a commit. There's no reason to burn the GoodSheet template on what happens to be my current main for testing. It should never reach that function to even get into memory, considering the busy wait-state of the current main. One of my first uses of a while loop. Also, it's expected to just keep running, and for a heartbeat, I can clearly send a period using the new print parameter, end. So, setting end equal to an empty string, the print statement continues on the same line as the last print statement when writing to stdout (a tiny sketch of this heartbeat loop appears below). Hmmmm. Yummy hackable. Haven't enjoyed this sort of discovery so much since I first learned this sort of stuff in Amiga scripting, ARexx over dial-up Internet and such. I didn't realize stuff reached this level of awesomeness in the FOSS world. Jupyter Notebook on Python on a modern desktop computer is an awesome proposition. I can feel the next generation SEO stuff percolating. I'm first there in the noosphere... baby! Sure, others are using it here and there, but I'm going to take it to a whole new level. Okay, go do it. Now that you have a ticker and an output stream and a timestamp, put it all together into a heartbeat busy loop. First, clear the sheet BEFORE the loop... but first commit.

--------------------------------------------------------------------------------

## Thu Jun 16 12:33:14 EDT 2016

### Today Was a Work Day, But In a Good Way

Ugh! Hit by the unexpected. The SERP-collection directly into Google Spreadsheets has become too much data, given how I archive the results.

Thu Jun 16 14:43:57 EDT 2016

Just had my 2:00 PM call. Haven't done anything at home at all yet. Ugh! Been doing work, actually... but Pipulate 3 -- or maybe GoodSheet -- not sure which, yet.
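The heartbeat busy loop from the 19:20 note above might look something like this minimal sketch (the gsconn connection, sheet and tab names are placeholders; the timings are illustrative):

    import time
    from datetime import datetime

    def tictoc(asheet, minutes=10, total_rows=480):
        """Plop a timestamp into the sheet every `minutes`, printing a '.' heartbeat."""
        for i in range(total_rows):
            asheet.update_cell(i + 1, 1, datetime.now().isoformat())  # one row per tick
            for _ in range(minutes * 60):
                print(".", end="", flush=True)  # heartbeat stays on the same line
                time.sleep(1)

    # tictoc(gsconn.open('longneasy').worksheet('Sheet1'))  # placeholder names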
But not by creating the new code, but rather by butting heads with one of these more SEO-esque projects carried over from LAST WEEK!

Thu Jun 16 16:55:01 EDT 2016

I'm going to stick to my plan to tackle the apartment today. It's just that I ALSO put in a good day of work, and will CONTINUE to put in inspired work on my GoodSheet or Pipulate 3 product. I'm leaning towards GoodSheet, because it's much easier to explain to a Jupyter Notebook user than Pipulate. Hmmm, need to start thinking about a logo. Despite the pains of such a job as the one I just did for my boss, it really serves to reinforce all the notions that I need to be thinking about when I start the next version of my system. It also impresses me with the profound utility I actually did bake so well into the old system. The only thing wrong with it is not having a good strategy for backing off, and indeed maybe even re-connecting with another http call after a time. It seems to me that a few things are true:

1. I took the pressure off of myself to use my new system for the current job.
2. The old system was able to churn its way through it, eventually, and that's pretty incredible in and of itself.
3. Having been through that experience, all the important issues to think about are really fresh on my mind.
4. It's really urgent that I capture that, fleshing it out here in my journal, such as it were. DON'T go head-first into coding.
5. With the pressure off of yourself, maybe it's time to try parsing your meta journal. Maybe the YouTube integration. Hmmmm.
6. With the pressure off of me, maybe I should concentrate on some areas and take some approaches I would never have otherwise.

Ah HA! I need a benchmark job... I need a single, looping, controlled-conditions, API-hammering job -- resilient, backing off and retrying, and inherently self-logging, due to its success at connecting to something that can record an event (a sketch of the backing-off-and-retrying idea follows at the end of this entry). In short, you're going to do something that can run with the sustained reliability of what you're doing with the Pipulate scheduling system. Okay, keep that nearly last job (if it doesn't break) running in the background. Keep checking on it. Make sure it finishes. And extract your learnings RIGHT HERE AND NOW. Your apartment stuff may be your immediate time and money, but this stuff is your career, smarts on a day-to-day basis, the educating of your daughter, and earning capacity. I've got my priorities straight. There's no more important thing that I could be doing at this very moment than channeling my inspiration and hyper focus -- usually wasted on my subway commute -- directly into thought-work and coding instead. I just noticed that when you start Jupyter Notebook multiple times from the command-line to get multiple separate tabs in your browser, you also get different port numbers! First, 8888 then 8889. How clever! Wow, okay. Next steps? A LOOOOONG sustained connection that I can always be testing. From time import sleep! Done. Okay, this looks like a good time for a commit.
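A minimal sketch of that backing-off-and-retrying idea, wrapped around any flaky API call (the function being retried and the broad exception are placeholders; real code would catch the specific gspread/GData errors):

    from time import sleep

    def with_backoff(func, tries=5, base_secs=2):
        """Call func(); on failure, wait 2, 4, 8... seconds and try again."""
        for attempt in range(tries):
            try:
                return func()
            except Exception as err:  # placeholder: narrow this to the real API errors
                wait = base_secs * (2 ** attempt)
                print("stormyweather: %s -- retrying in %s secs" % (err, wait))
                sleep(wait)
        raise RuntimeError("gave up after %s tries" % tries)

    # e.g. with_backoff(lambda: asheet.update_cell(1, 1, "ping"))  # placeholder call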
--------------------------------------------------------------------------------

## Thu Jun 16 09:41:21 EDT 2016

### Thinking Through Premises of New System

- Information wants to be free and open
- Information wants to be expensive and proprietary
- The Business Models In the Middle (tension, vibrating edge, FOSS companies)
- Gestation and incubation periods
- Competitive pressure to grow up too fast and do adult-things before ready
- Being sheltered by parents (the prior generation) for long maturation periods
- The occasional advantages of a long runway and heavier plane
- Heavy-weight approaches (humans) versus lighter-weight approaches (reptiles on down)
- Strategies that are adopted accordingly, and the connection to CISC vs. RISC processors
- Sticking to the well known, tried-and-true paths versus risk-taking and innovating
- How much predisposing should you expose yourself to
- When and why should you deviate from best practices?
- Do you do it at the program coding level or at the data level?
- What's really the difference between running code and the data it's manipulating?

Okay, it's going to be important to process these Pipulate-like jobs from Jupyter Notebook in parallel. One job doing SERP collection, and another job doing stuff on the SERP data collected so far. SERP collection will be bound by how fast you can trigger off the requests without triggering the captcha thresholds. Also, I could round-robin or randomize the requests to go through a variety of different anonymous web proxies. In fact, I could just cycle through the known good ones, and once obtained, do 5 or 10 and then cycle onto another IP (a tiny round-robin sketch appears below). So many good approaches. So many things that should be incorporated in at the high level of the code (not hidden in the nested granular interiors). Alternatively or maybe also, such logic should be transparent to the user of the system, except maybe for knowing it's going on, and being able to change a few parameters, like how many requests or what time-duration it should use each IP for, before it moves on. Today is mostly about doing stuff around the house. But it's also about Pipulate 3. I will be going back and forth. Think through these architectural issues surrounding Pipulate, especially concurrency when needed. You can always create an entirely new instance of Jupyter Notebook by simply opening another command-line, cd'ing into the serpchiver directory (how it's named, currently) and typing jupyter notebook again. You'll have another shell window with the running console output of notebook, and another tab in the browser, presumably with its own separate Python kernel and virtual machine that can be restarted separately... ooh, ooh, try! Confirmed. Woot! Bodes very well for simultaneous tasks, hitting the same or entirely different spreadsheets, optionally working together with Google Spreadsheet ImportRange and VLOOKUP's, which seems to be a pretty winning combo, even if a little difficult to wrap your mind around (VLOOKUPS, in particular). The trick will be to do as much in Pandas as possible, and then insert cell ranges back into location with batch updates with GSpread, as many rows-at-a-time as makes sense to optimize speed-of-updating the entire worksheet. There will be some algorithms and calculations there.
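To make that round-robin proxy idea concrete, a minimal sketch (the proxy list is made of documentation-reserved placeholder addresses, and requests_per_ip mirrors the "do 5 or 10 then cycle" thought):

    from itertools import cycle
    import requests

    proxy_list = ["http://203.0.113.10:8080",   # placeholders -- swap in the known-good list
                  "http://203.0.113.11:8080",
                  "http://203.0.113.12:8080"]

    def fetch_round_robin(urls, requests_per_ip=5):
        proxies = cycle(proxy_list)
        current = next(proxies)
        for i, url in enumerate(urls):
            if i and i % requests_per_ip == 0:
                current = next(proxies)          # cycle onto another IP
            r = requests.get(url, proxies={"http": current, "https": current}, timeout=30)
            yield url, r.status_code

    # for url, status in fetch_round_robin(["https://www.example.com/"] * 12): print(url, status)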
And today, my life moves forward in a major way on 2 fronts:

- My new SEO Secret Weapon (in a publicly consumable form) comes online
- The extreme disarray on the home front gets ordered for more capability

Okay, if I do this right, I can connect some amazing-to-connect dots. As crazy as I would be to do this undertaking WITHOUT Pandas, there's so much I can do immediately... well, without Pandas. Imagine if I got IPython Notebook to install and run on Levinux as just part of the Python 3 install, and I got it running in webserver mode, using the QEMU console for the back-end IPython server output, and of course the native web browser on the host machine as the way to interact with it. It may not be the speediest thing in the world, but it would certainly illuminate the path... maybe set the stage for others to do those optimized QEMU binaries for text-only Tiny Core Linux on each major host OS. Wow, imagine if I made the benefits of Levinux mainstream and sexy enough to attract some major attention. Just replace how you were wedging Pipulate into Levinux with Jupyter Notebook. That's HOT! Okay, get a coffee, and then think about those tech pages in a parallel Pipulate task. I can still use the Pipulate servers that have their IPs captcha-banned from Google for other tasks, like homepage crawls. Go find that old document that has the Scraper pattern for that particular job. Found the document. Nahhh, I don't need to start from that work. Just think it through from scratch. That's the thing about the right tools... every project feels like it's "from scratch," but really isn't, because your component parts are exactly at the right chunkiness of abstraction-level, so that you're never writing too much when you "start over." It's just such a pleasure -- muscle-memory obvious, and fine-tunable to the peculiarities of the particular case at hand -- that you don't have to maintain your own "excessively customized" libraries. Those DRY folks are DRY for a reason. WET is more fun. 1, 2, 3... 1? Create a new Spreadsheet... and save it... hahaha! The necessity to SAVE a spreadsheet from Google Sheets before an API could interact with it was zeroed in on by dipblip as a weakness of my system. Is actually having to EXECUTE the CREATE TABLE command you designed before you can interact with the table a weakness in SQL?... haha, dumbass.

--------------------------------------------------------------------------------

## Thu Jun 16 07:50:45 EDT 2016

### The Tale of Two Pandas: Loosely Structured Meanderings as SEO Transforms Into Data Science

There's a reason why the field of SEO is always reading its obituary, while the field of Data Science can't fill its open job positions fast enough. One is quite literally being transformed into the other, as Google ups the level of smarts you need to stay in the field. And this happens to dovetail perfectly both with what I need to do with my career, and with what I have to do with my daughter's homeschool education -- which I'll cram into weekends with "recreational" activities. But these recreational activities will all be aligned to producing massive, wonderful data-sets to analyze, derive actionable findings from, and get rich and make sure Adi's future is financially assured in such a way that "frees" her to pursue her dreams, regardless of whether they happen to align with Tech or not.
Odds are, everything aligns with tech and data more, rather than less -- at least, if you want to optimize, stay competitive, become best, and always be able to work less and earn more. At least, that's the dream. Let's start with an observation. Two Pandas met in the woods, and sorry that I had not befriended both, I got to know the one increasingly more traveled by, because I'm looking at the data, and I'm not stupid. An SEO is being chased by one Panda into the waiting arms of another... Pandas -- the Python Data Analysis Library. The field is transforming, and you must transform right along with it... mastering the very process of data transformations, so that you might naturally and intuitively extract those critical key findings that used to be "first-degree" derivatives of a limited number of known factors, but today might be the result of multiple unknown factors, many of which you have no first-hand data for, and must carry out data-collection tasks against their side effects, and intuitively "reverse" in an attempt to find causation, and optimized courses of action. It's no small coincidence that my path is leading me towards the Python Pandas package for data science-type work, when the changes in the search engine system at Google have also been nicknamed Panda. They both have a lot to do with data, but the latter was the circa-2011 re-shaping of the data in search results to stop rewarding data manipulators, and the former is a tool to make manipulating, analyzing and making use of data under the Python programming language much easier. It's fair to say that both have been transformative of their respective industries. The Google Panda has already had a tremendous impact on my career and life (an attack on the long-tail publishing strategy), and the Python one is about to -- figuring out what to do now, as Google's general intelligence-level increases. Well, not the least of which is repositioning myself either OUT of the field of SEO, or completely redefining what SEO is. And the latter sounds more likely, since it's a keyword that has tons of ready-to-follow sheep as a built-in audience -- panicky, information-starved pups just waiting to sheepishly follow an insightful, transformative, and well-written article by me illuminating the new age of SEO, and how we all must become Data Scientists, because what works today cannot be generalized, but must be gleaned on a case-by-case basis from data, differing based on the amount and quality of information to which you have access, and the skill and finesse with which you can ask the right questions, tease out parts of the answers (using Pandas, etc.) and not fall into the correlation/causation trap, but rather peel away layers and spot unlikely connections until truer truths emerge. THAT is the new field of SEO, and I would be an idiot not to jump on the Pandas bandwagon at this juncture of me re-implementing a least-code Pipulate, built on top of the impressive IPython Notebook framework. Yup, everything is a framework, and this is yet another. But the thing I like about it is that it gives a consistent code execution environment, much like what I was trying to accomplish with Levinux, but the code is running as close to natively on your local hardware as it gets these days, in a controlled environment that can be distributed, inspected, reproduced, and creatively adapted by others without too much trouble. How much trouble? Well, that's right at the heart of the matter, isn't it? First off, that's why Python.
If you're going to have to take up a coding language, it might as well be one that's equally great at Data Science as it is at Web Publishing, and a dozen other specialized fields into which Python seems to be compulsively driven by zealots as a first-class language. There are exceptions, like explicit concurrency. But stick with Python for implicit concurrency, through such transparent mechanisms as swapping out CPython for Ufora. BAM! Concurrency for free, whenever "map/reduce" is your path to increased performance. Computers can spot a map/reduce-solvable bottleneck in your code and optimize it better than you can, I assure you. For example, Google programmed the Go Language for this problem, but it was written for Google Systems Engineers who, trust me, know how to explicitly parallelize their tasks better than you. For the rest of us just trying to be Tech Literate, there's the understanding that we can benefit from non-blocking code, concurrency-biased programming patterns and actual utilization of parallel processors -- just as we would from gcc optimizations when programming in C, without having to actually write our optimizations in Assembly Language. Computers are good at this sort of stuff. You don't have to write your own optimized "execution plans" in SQL or collect your own (memory) garbage in Java. It stands to reason you should not litter your primary Tech Literacy default everyday go-to language with forced-concurrency-thinking first. It's a little like why I don't like Ruby, with its forced Object Oriented thinking first. Sometimes, you just don't need the fancy computer science concept du jour shoved in your face all the time, just because that language is designed to address those edge cases. While it's true that having multiple processors at your disposal can improve your code-running performance (even the $35 Raspberry Pi is quad-core these days), it's not true that you have to think about it. Optimization for multi-core is something that can be done during the "compiling" process, because Python and nearly every "script" programming language these days has a pseudo-compile process, changing the files you wrote into machine instructions for some sort of language-specific virtual machine built into the runtime environment, usually as some sort of just-in-time (JIT) compiling process. That's where all those .pyc files come from under CPython (the default python executable from python.org). So, pick Python for the same reason as the rest of the world, and trust those insanely dedicated core and 3rd-party package developers in the Python community to be there forever, and to keep tackling your problems for you, years before you ever get to actually having to "go there." But when the time does come to go there, it'll be wayyyyyy easier, because you're on Python. So, how much trouble should you go to in implementing your "final-mile" applications? Not much. If you find yourself going to a lot of trouble, you haven't searched enough for the right package to provide just the right shortcuts to keep your code short, readable, and easy to adapt to new situations... because there will be new situations, I assure you. In the software development field, there is no rest. You have to keep learning and learning and learning effectively, or the next hungry developer knocking at the door will eat your lunch. But it is one of those fields where you do not have to BE a software developer to get the benefits of developing software. Even the labels put on this stuff are wrong. Let's see...
software developer, programmer, coder... nahhhh... none of these say it right. I'm tempted to just go as broad as modern literacy, but that's a little too wacky. Maybe it's just "digitally literate." Digital Literacy or Computer Literacy or Tech Literacy gets to the heart of what I'm talking about. If I had to really sum it up...

- Tech Literacy covers potentially too many topics.
- In regular literacy, you're literate if you only speak one language.
- But you do need to know math too in order to really be educated.
- So, the question is what do you REALLY need to know to be Tech Literate?

With Tech Literacy, the challenge is paring it down to the minimum viable selection of topics that you should be expected to be comfortable with for a baseline level to be considered literate. I'd take a crack at it like this:

- At the end of the day, it's really just all about extending your capabilities
- There's way too much to learn, and the wrong choices can be counterproductive
- While there are no perfect choices, there is the 80/20 rule and good enough
- Unix, Linux and *nix-like OSes, including some strange ones like Tiny Core
- vim and git are least-perfect of all, but still top-of-list worth learning
- Python has become the modern lingua franca, with API-wrappers for everything
- Hardware matters: a bit about processors, storage, and processing bottlenecks
- The nature of data, collection, transformation, analysis and deriving insight
- Making machines work for you: algorithms, automation, scaling, deployment
- Imperfect worlds, and designing appropriately: one-offs, durability, dynamism
- Tech should fade into the background so you can focus on what's important

And from THERE I can get into SEO... and Machine Learning and a dozen or so projects that suggest themselves to me at this point. Wow, maybe this post is the one that I get a Medium.com site off the ground with. Add that as a striking-distance project.

--------------------------------------------------------------------------------

## Wed Jun 15 22:28:29 EDT 2016

### I See Pandas in My Future

Okay, the descent into Pandas comes from the fact that my Pandas operations (while using goodsheet as the include) always draw their source data out of Google Sheets, by convention. Let's create a column labeled URL and another titled Title. Let's set out from the start to make these names in the spreadsheet case-insensitive. But I have not yet decided whether to support spaces in these "field names" or "keys." Use whatever mental model suits you best. Row 1 has a special meaning: it contains either the names of input parameters or the names of functions that will produce the output for that column, in any cell whose row supplies sufficient input values for the function. See, pretty easy, right? Hoping that Pandas' per-row function execution inherently supports using other values from that row as function input. It must. It simply must. Pandas MUST be the next version of Pipulate just waiting for an SEO to discover it.

--------------------------------------------------------------------------------

## Wed Jun 15 19:48:27 EDT 2016

### Internet Out. Time to Think.

Wow, it looks like the Internet is out at the building. My router is connecting to the gateway just fine, but sites aren't loading, so offline editing of my journal it is -- and bravely going to intentionally merge where I have not willingly gone before. I get merging conflicts. Some code is inserted. Woooo, scarrrrry.
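(A minimal sketch of that Row 1 / per-row idea from the "I See Pandas in My Future" entry above, assuming a url column and a made-up title_of helper -- requests and BeautifulSoup stand in for whatever the real Row-1 function ends up being:)

    # Hypothetical per-row function execution: the "Title" column is produced by
    # running a named function against the "URL" value on the same row.
    import pandas as pd
    import requests
    from bs4 import BeautifulSoup

    def title_of(url):
        """Made-up stand-in for a Row-1 function name."""
        html = requests.get(url, timeout=10).text
        tag = BeautifulSoup(html, 'html.parser').title
        return tag.string.strip() if tag and tag.string else ''

    df = pd.DataFrame({'url': ['http://mikelev.in/']})
    df['title'] = df['url'].apply(title_of)   # other row values feed the function
    print(df)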
Oh, I need to add a line to my home .vimrc... oh wait, I would need github for that. Odd, isn't it; reading about someone being offline while you're most likely sitting there online. But who knows, you may be an Instapaper user, reading this on your Kindle on the subway sitting next to me. No wait, I'm not on the subway. I am at home without Internet. Might as well be the same thing. Bravely going, seems to be my theme of late. I'm bravely going... I'm going into single-life. This will be the first time I've ever deliberately gone into being single, because the first time just sort of happens to you as you grow up. As do general shifting expectations, themes of life, tolerance levels of others that annoy you, and all the other... what? Facets of life that you can focus your conscious attention on? Is that the right use of the word? Or is it consciousness? I always forget. Anyhoo, I get to think, and I get to write. And I get to think and write using one of the gosh darnedest best platforms I've used since my days of the Amiga computer. This is just way too cool for school. And so, I'll have to think about how to make it Mike's Tech School. I ain't changing my name, so that might as well be my identity. I need a subhead. Well, this goes under the category of organizing my book outline. How should it go? Oh yeah, I respect my time more than yours, but only slightly. I will warn you away from what I write here, pointing out, it will mostly just become an endless bottomless pit of writing that you maybe for some incomprehensible reason actually enjoy reading, but will never be able to keep up with, so it might make you end up feeling bad, if you ever really do want to connect with me, and will have to admit that you stopped following me, and are not in-sync with my most recent, and undoubtedly my most interesting ever thoughts, and that's totally fine. I wouldn't have the stomach to read me, either. None-the-less, I write. I write and I write and I write, pointing out I could have typed that in vim with 3aI write,[space][Esc], and BAM! But then again, maybe I didn't use my vim Kung Foo, because I (We) enjoy (E) Typing (T). You go and don't (D) repeat (R) yourself (Y). That would be good, because then I wouldn't have to listen to you make that point over and over. I mean really, if this were about efficiency and the purity of your message, then you'd have shut up about it already. Instead, you're going on about some evangelical mission that's way more about pride and self-image than it is about a creative dude using a creative language in creative ways, and you just having some sort of size-envy problem, when it comes to imagination. BAM! You're an SEO. Let's go googling what you get paid to be able to live there in NYC. Sighhhh, we are what we see the most of in others and accuse them of, because we know those motivations well, and imagine we recognize them when we imagine we see them in others. Know yourself. Listen to yourself complain, then man-up. Or woman-up. Or Flying Spaghetti Monster-up. Whatever, just self-improve. Life is like, meta, you know. And become the life shaping life. That's what life does. It organizes itself and shapes itself into better, more durable information over-time, until such time as it is the most stable and durable organization of matter in the Universe, then it sorta hangs out and chats with God, having achieved such total internal self-awareness, that the next level out is inferred through the infinite edge of all known boundaries. Pop! Or not. 
Maybe we transcend with a whimper. Those YouTube videos have really shown me that if we don't go and Star Trek space-bubbling ourselves across vast distances, our descendants could never ever even reach the next galactic cluster. A bleak outlook for monkeys with egos as big as our own. Or it could be more like Ian M. Bank's Culture series, where we keep blipping up to god-hood, without having to suck all the matter in the galaxy along with you for the ride. Basically, you achieve whatever space-blippy state marking the transition from mere 3-dimensional plus-time beings into 4th spacial dimensional creatures to whom our dimension of time looks fooling, were it not for the seemingly convincing randomness that our free-will would imply. That just means we're a successful system. It has a contest-y feel to it. God doesn't roll dice with the Universe -- or perhaps (s)he does. It must be one of the biggest god-kicks to some little bright-matter speck of a dot on a foamy membrane of mixed-up rules that are all just settling down from being cast (Big Bang) and turning into some interesting things (us). What you're looking for is patterns. Patterns to spontaneously emerge, deep, deep down in the nearly impossible-to-spot and equally impossible-to-visit depths of this bubbling soupy cauldron of stuff that we, as beings inside of it, like to refer to as the world... the Universe. All of existence, and perhaps a multiverse or two... or more. Who knows? Rick and Morty says it pretty well. No Internet, and this is therapeutic. Oh yeah... transitions. Me into being single. Gravity fields and spooky entanglements at distances. There are those from whom you will never be un-twined. Maybe you had a kid together, or maybe the relationship just made that large an impression on your life. Maybe you think you found yourself while you were with another particle, intertwined into a molecule. Or maybe you learned how it felt to be stable, and only have a problem with the particular body with whom you orbit. Orbits and gravity and attraction, and forces overpowering other forces, and forces succumbing to forces, and forces going into mutual orbit, the visible movement of each will be a function of their mass, and corresponding gravitational pull. The more mass you have, the greater your pull. Some particles are small and light. Others are bloated. The small and light will go in orbit around the larger body, and will appear much as an electron around a nucleus, or a planet around a sun. At other times, the ratio of size to particles will not sustain such clumpy elements and distinct boundaries. Instead, it's all rather smoothly mixed in galaxies and nebulas. Different particles with different gravity-like forces being exerted on each other, and different configurations. But all rather lumpy, swirly, internally self-rotating, oscillating, pulsing stuff. Goo. Ingredients for all sorts of stuff. Sometimes us. Yeah, must be a kick for the greater being when it works out just-so. And hence, my appeal to the whole Tardigrade Circus thing. My paragraphs are getting longer, and I am shaking most of you weirdos off. Whatever you're looking for here, you're not going to find. I'm just another weirdo drawn to such nonsense writings as Eric S. Raymond who wrote The Cathedral & The Bazaar, which I will admit Jeff Porter told me to read, who wrote about homesteading in the noosphere. 
Yeah, I get it: "new"-sphere, but you can't well go writing newsphere, because the "news" group forms an overpoweringly strong visual unit and comes off as news-sphere, which would only cause unlimited confusion -- worse, even, than curly braces, inherited from BCPL and other languages that needed them for compiler considerations. Look at the curly-brace limitations in the Google Go Language. Now, keep in mind it's written to compile C-like code fast, and to favor a concurrency-first style of writing, assuming the author of high-level programming code should take primary responsibility for optimizing what should or shouldn't be made concurrent and parallelizable (is that a word?). And then think what lovely expressiveness you could have if only some programming language worked a lot more like plain old English -- or better still, if a well-thought-out and mostly consistent language that's not too flowery (curly braces) came along and was taught to us all at a young enough age, so that by the time kids grow up, manipulating concepts and information and anything automatable comes as second nature to them. How does the automation industry refer to the concept of that which can be automated? Automateable? Somebody tell my spell-checker. Okay, so here we are. Ol'skool confirmed, and on a nifty platform, nearly as love-worthy for good design decisions today as the Amiga computer was love-worthy for good design decisions in its day. Apostrophe on that its or not? What about the second? Haven't felt this good about quirky software that you can get into the zone with since Deluxe Paint on the Amiga. Yeah, that's it. Once you force yourself to give up your love affair with superior graphics -- because once everything is high-res, 24-bit and animated, with tricks totally as nifty as the Amiga's were in its day, on the best that modern hardware can offer, cheap, cheap, cheap! -- there's no denying that the love-worthy stuff is no longer the bigger, badder blitter. The size contest is over, and everybody won. Our graphics abilities are real big at prices that are real small, and the software people can barely keep up with the ever-improving possibilities offered by input/output device improvements (like touchscreens) and virtual reality and haptics and every other silly immersive, agent-like thing. All the skills I hold so valuable today will someday be as old and passé as those Amiga skills are now -- with one key difference. Information will be information. And processing environments can and always will be able to be virtualized. The ideal abstract computing units can be sent on little Noah's Ark journeys through space and time, working 1000 years from now as well as they are working today. It's not robots or high-concept science fiction. It's just quality and craftsmanship, and power-source cleverness, and the ability to stay in a good state of repair. Hmmm. Do we throw in reproduction, as in being able to manufacture other copies of themselves? Should their seeds, or boot-kernel operating systems, be tiny and mobile and perchance infectious? It's only the tiniest slip-up that separates hacking matter for its remarkable and useful properties from accidentally dropping a pinpoint black hole into the earth's core. The scenarios are many by which we may wipe ourselves out sometime soon. Okay, the Internet's back. Prepare to merge. Save, but don't commit. Then :sh. Then git pull. It tells me to stash or commit first.

    git commit -am "Prepare to merge"
    git pull

Warning of a conflict, and a merge occurred.
    exit

vim warns that the file has changed on disk; L to load the changed file. Glance at the file, see that I should continue writing about this intentional git conflict with self. Ponder the meta moment, and realize this may come off better portrayed as code than as formatted html. Ahhh, html. How fondly I remember you. Markdown is really getting under my skin, not so differently from Python. I fell in love with a number of intertwined technologies that I deliberately sought out, in order to expand my capabilities intelligently. I think I remember one of those things being Python back in ye ol' days with my friend in Japan, Guillaume Proux, one of the officers of Scala Digital Signage, who still remembers the good ol' days when I built my primitive version of Ruby on Rails -- but in VBScript on Active Server Pages on IIS on Microsoft Server, connecting with Microsoft SQL Server -- doing things as an amazingly cool and tightly coupled unit of awesomeness. Databases suck. Too much goes into setting them up and maintaining them -- care and feeding, and sometimes the painful archiving and pruning, performance optimizing, index designing, tweaky execution-plan optimization and retention... well, you get the idea. Some technologies have staying power, and others are Microsoft. They know it. That's why they're putting the Linux Bash Shell in Windows 10 by default. You slowly cook the frog. Here's the game-plan, and why everyone continues to exist in some form or another 10 or 20 years from now. The necessity for self-re-invention is obvious. It goes like this: If you only ever were exactly who you are today, your enemies will destroy you. And you will attract enemies like lint to a sweater on a sweltering Sunday morning when you're emptying the dryer lint traps at a coin-op Laundromat he had to run in his fifties as a washed-up ex-textile engineer who still needed to make income to put you through college... so you gave him Sunday mornings off. Wow, my dad worked hard for me. It's hard to imagine how hard he worked for me, and for the family that only broke and fell apart around him... mostly because of my mother and her issues. My dad would have been happy forever with my mom. That was pretty clear, just as clear as it was that this would never happen. My mother despised my dad. She hated everything about him, because she knew him too well, and never helped make him a better man. Or he never took the risks she was willing to take, like moving to New York, or using the family savings to start a new business, instead of continuing to work for "the man" in New York with a grueling commute from the Philly suburbs, which I imagine really must have taken a toll on him over those 13 or so years he did it. My dad talked a lot about the good old days with the skiing club and when he owned a boat. He was very proud of those. Because there was nothing like that in his life anymore. Ugh! Repeating patterns. Fell into the trap... perhaps a little bit less so. I will align my interests to fields and careers on the rise. I may not be pursuing a degree in computer science or data science, but I'll be pursuing the expert use of the tools in both. I will be the can-do with-data guy, pretty darn soon. Now that the Internet's back, I feel the draw of delving into what I'm coming to think of as Pipulate 3, or possibly GoodSheet. One way or the other, there will be packages for Jupyter Notebook that will be distributed most likely as git repositories in Github that can be cloned.
One will probably be goodsheet, which will provide all the Google Spreadsheet love-ability, which is critical to having a useful place to house and share small-scale data-collection runs from the net. I LOVE the web-based interface of Jupyter Notebook. Wow! It's like a dream design. I'm going over to it right now to carry on this project.

Wed Jun 15 21:54:36 EDT 2016

Okay, back in business. Commit a working copy of the latest Jupyter Notebook work, and comment here on how keeping the notebooks in sync between different repos will be an interesting issue. The imports are a snap now, with my new always-absolute-path magic. Just git-clone the repos that constitute the system into places next to each other, and they will be there, sitting as latent but known components I can draw upon and keep namespace-compatible. Import them at the same time and have happy, fun, strange interactions between them, communicating at the same-task, same-memory level of Python modules. I have a nifty plan. Now, go clone serpchiver. I could go all night if I let myself, and then crash hard, and clean and organize tomorrow, but now with long-running jobs running. Screw scheduling. I need the job that I JUST STARTED to reliably run to the end, and reliably update Google Sheets with that data. I need something cumulative... exploratory... persistent, capable of sustaining multiple transforms, able to be mapped back to the Google Spreadsheet from where the input data came, to get inserted properly into the right rows, or to make new columns, or new worksheets. Whatever result (deliverable) is supposed to be derived from the job. Pandas. Definitely, Pandas. Shoot, I'm on the edge of the Rabbit Hole, looking down. But some rabbit holes you just have to take the plunge into. Deep breath in. You're about to make one of the most important career decisions of your life. I can feel it coursing through my body, the anticipation of data-power. I've dealt with SQL, fairly extensively. I'm pretty good at it. I've dealt with kooky limitations and awkward work-arounds. HAM mode for lots of colors on the Amiga is one of the ultimate heartbreaking compromises. Sure, you can have 4096 colors, and stunning photo-realism, and shocking "new" graphics modes discovered all the time, like half-bright HAM mode (HAM standing for hold-and-modify). You could get that photorealism some of the time, but the pixels had to look like bricks, and only algorithms that averaged and crunched numbers all the time could keep HAM mode looking half decent. To truly love the Amiga took a lot of overlooking and forgiving. Along came no-compromise high-res, with accelerated 3D graphics, no less. Heartbreak. Tools do die. Unix, and modern incarnations such as Linux, didn't die. Speaking of which, the free and open source software movement (FOSS) isn't dying anytime soon. The vi text editor, in modern incarnations such as vim, isn't going away anytime soon. 30, 40, 50-year track records say a lot. No matter how modern is modern, an old classic is still an old classic for the perfectly valid reasons that made it memorable in the first place. And now take an already modern classic, such as Python, and keep giving it power, perfecting it, making variations, making completely alternative runtimes for special edge cases, like PyPy and Ufora... again, and again, and again. Base many of the most important business systems of the world on it, in cases both known and unknown, of massive scale, complexity and performance.
I have to believe Python won't die anytime soon, and in fact still has many of its best days ahead. On to learning the Pandas API... Oh yeah, get serpchiver, and confirm that all that delicious work you did today in the IPython Notebook file still pulls down from Github cleanly. Okay, had to update it a bit, but I think I'm in pretty good parity between my journal repo and my serpchiver repo. Good starting point. Can I keep going? Take a few minutes. Maybe even think about dinner. 10:30 PM... sheesh! The zone is upon me. Immersion into the flow is imminent. Don't disrupt it or go heavy on food... oh yeah, Happy Hour from work. That's what's happened. Now, go transform yourself into a high-impact player... after some noodles.

--------------------------------------------------------------------------------

## Wed Jun 15 13:20:28 EDT 2016

### Pipulate 2 (or 3?) Taking Shape

Okay, I'm refining the absolute path import logic a bit:

    # Add parent dir to Python path and do OAuth2 setup work
    from sys import path
    from os.path import dirname, realpath
    adir = dirname(dirname(realpath('__file__')))
    if adir not in path:
        path.append(adir)
    from goodsheet import *

This is what's going to have to be at the top all the time in Jupyter Notebook if I want to be able to import goodsheet from lots of different notebooks. That's fine. Let's now break goodsheet out into its own Github repo. Ugh! I'm going to try to adhere to PEP 8 more closely. First, I'll change the 2's to 4's in my tab indents in my .vimrc... Done. Okay, now delete goodsheet.py from both the Journal and serpchiver repos... Okay, done. Had to update the Python at the top that imports goodsheet:

    # Add parent dir to Python path and do OAuth2 setup work
    from sys import path
    from os.path import dirname, realpath
    adir = dirname(dirname(realpath('__file__')))
    adir = '%s/goodsheet' % adir
    if adir not in path:
        path.append(adir)
    from goodsheet import *

...but now, this is feeling quite solid.

Wed Jun 15 18:10:02 EDT 2016

Was at Happy Hour for a little bit. Hmmm, okay. Think. Get a tiny bit of Pandas tutorial in, then head home for your day off, read the Pandas O'Reilly book on the way home, and contemplate the UI and UX you're designing under Jupyter Notebook. And go home and alternate between cleaning your place and giving birth to Pipulate 2 (or 3?).

--------------------------------------------------------------------------------

## Wed Jun 15 09:13:10 EDT 2016

### Not Learning Pandas Now Would Be Ridiculous

Well, as expected (and I should have known), Windows restarted my machine overnight, and kept the reports from running. It's becoming more urgent to get the system off my personal machine, and onto some dedicated hardware. Or maybe I script my virtual machine to restart every night. One thing's for sure, I'm not leaving my Kubuntu VM full-screen, dual-monitor anymore. Windows is Windows, Mac is Mac and... hmmm. Linux is Linux. I'll just finally put my CuBox to work. But I HAVE to get that batch job underway ASAP! Try to get things underway each day with a stupid journal trick. I have the basic Jupyter Notebook virtual environment trick working now for two of my Git repo projects:

- miklevin.github.io
- serpchiver

Serpchiver holds a lot of promise as a potentially popular and infinitely useful project. It's the seed of the Pipulate 2 project. Maybe I should call it Pipulate 3 to stay in sync with Python 3, which is really the impetus of this project. Hmmmm. Okay, I need to get that GSpread integration done.
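Once that path boilerplate is in place, a notebook cell presumably only needs something like this (the spreadsheet name is a placeholder, and gsp is the authorized gspread client that goodsheet sets up):

    # Hypothetical first cell after the path boilerplate above
    from goodsheet import *                      # brings in gsp, the authorized client
    wks = gsp.open('serpchiver-scratch').sheet1
    print(wks.row_values(1))                     # peek at the header row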
I've made this wonderful module called goodsheet.py, which lives now in my journal folder AND in serpchiver. That's bad. The common module should be broken out and imported into each project. But that raises the specter of relative-path module imports. Hmmm. Just programmatically add the path of the parent directory of the current project to the Python path. BAM! Easy solution. Hope it's not a lot of code. Okay, also just added my vim spell-check file to BOTH Dropbox and Github, hahaha! Every vim installation used to be a spell-checking "island" -- very inefficient, in terms of always re-training vim to recognize my words. Now, Dropbox should propagate it around to all my systems without even thinking, after I update the minimal ~/.vimrc that I keep on each machine, AND it will "initialize" with all my learned words whenever I do a git pull in my ~/Dropbox/vim location, so long as I git commit it every once in a while from my common work machine. Git merges will be interesting in those cases when I get out of sync. Okay... next! The better-than-relative-path-import trick, and then breaking goodsheet out into its own project. Oh, this is great ecosystem stuff! Okay, get Pipulate classic churning away on this task... oops, can't. Burned that Rackspace IP. Hmmm. Hop into your next (and last) remaining deployed Pipulate server, and see if you can plow through another few hundred rows just to get it in the works. I have the job quitting on a Captcha, so I can just let the job run next to me and baby-sit it. Okay, it's underway. Next! Ah yeah. Externalize goodsheet.py into its own git (and Github) repo. The first part of this is to simply UNDERSTAND how to add the current directory's parent to the Python path.

    # Add parent dir to Python path and do OAuth2 setup work
    import sys
    from os.path import dirname, realpath
    adir = dirname(dirname(realpath('__file__')))
    if adir not in sys.path:
        sys.path.append(adir)
    from goodsheet import *

Now, I'm going to try to get rid of the bad markdown highlighting in vim.

    runtimepath=~/Dropbox/vim,~/.vim,/usr/share/vim/vimfiles,/usr/share/vim/vim73,/usr/share/vim/vimfiles/after,~/.vim/after

I downloaded this "official" markdown syntax file for vim, which happens to be ahead of the one in the official release and installers. Blech! Nope, going to wait this one out. I don't need my markdown syntax highlighted. I just edited my .vimrc to turn syntax highlighting off when editing this file. I guess that makes the spelling stuff all the more prominent, and my spell-check dictionary becomes a much better asset for me than it has been in the past. I just hope that Github and Dropbox are not both taken away someday. I need a third place for all my data to reside that I own. That'll be a good project for my Python tutorials: Your Copy of Everything. Add that to the striking-distance projects! Done, along with a bit of an update there.

--------------------------------------------------------------------------------

## Tue Jun 14 22:28:00 EDT 2016

### Pretty Sure Pipulate 2 Is Being Born

I just did this command:

    cat goodsheet.py | pbcopy

...which amazingly put the following code-block into my Mac OS X copy buffer (or pasteboard, hence pbcopy and not cbcopy).
    import argparse, json, os, sys
    import gspread
    import httplib2
    from oauth2client.client import OAuth2WebServerFlow
    from oauth2client import file, tools
    from clisec import *

    class Unbuffered(object):
        def __init__(self, stream):
            self.stream = stream
        def write(self, data):
            self.stream.write(data)
            self.stream.flush()
        def __getattr__(self, attr):
            return getattr(self.stream, attr)

    class MyCreds(object):
        def __init__(self, access_token=None):
            self.access_token = access_token

    # Force IPython Notebook to not buffer output
    sys.stdout = Unbuffered(sys.stdout)

    scopes = ["https://www.googleapis.com/auth/webmasters.readonly",
              "https://spreadsheets.google.com/feeds/"]
    path = os.path.dirname(os.path.realpath('__file__'))
    filename = '%s/oauth.dat' % path
    flow = OAuth2WebServerFlow(client_id, client_secret, scopes,
                               redirect_uri='urn:ietf:wg:oauth:2.0:oob',
                               response_type='code',
                               approval_prompt='force',
                               access_type='offline')
    authorize_url = flow.step1_get_authorize_url()
    storage = file.Storage(filename)
    credentials = storage.get()
    argparser = argparse.ArgumentParser(add_help=False)
    parents = [argparser]
    parent_parsers = [tools.argparser]
    parent_parsers.extend(parents)
    parser = argparse.ArgumentParser(
        description="__doc__",
        formatter_class=argparse.RawDescriptionHelpFormatter,
        parents=parent_parsers)
    flags = parser.parse_args(['--noauth_local_webserver'])
    try:
        http = credentials.authorize(http=httplib2.Http())
    except:
        pass
    if credentials is None or credentials.invalid:
        credentials = tools.run_flow(flow, storage, flags)
    else:
        credentials.refresh(http)
    with open(filename) as json_file:
        jdata = json.load(json_file)
    token = jdata['access_token']
    creds = MyCreds(access_token=token)
    gsp = gspread.authorize(creds)

And I just import that file into Jupyter Notebook, and I have magical Pipulate-like Google Sheets connectivity. I didn't finish my work tonight, as I had hoped. But instead, I'll get to bed early and wake up early. So close to getting Pipulate 2 up to this amazing level -- at least the kernel of its birth, given I have such fabulous conceptual blueprints, and a working existing product that does much of what I need it to do, but just lacks the reliability for industrial use. But I can get it to nearly industrial use. GSpread will be optional based on which Pipulate convention files you import. Currently, I'm working on the GSpread implementation, which is goodsheet.py.

--------------------------------------------------------------------------------

## Tue Jun 14 22:10:27 EDT 2016

### Let Machines Optimize Python's Concurrency For Machines

Wow, it's hard to believe what's happening to this repo directory. I'm quite into the concept of the self-modifying journal. You know, this is not quite LISP. Just imagine yourself working in a specialized language built up to just the right level of abstraction and compromises between performance and flexibility to solve a remarkably broad set of problems... say, everything but edge cases, and even those you can handle pretty well. Global interpreter lock? No problem: skip the compiled C-code modules and use a different Python runtime environment, like PyPy for compiled-C speeds automatically (if you give up some mutability), or Ufora for parallelism... WITHOUT giving up your numpy (in fact, its code execution engine is optimized for numpy), but giving up a bit of your list mutability.
Imagine lists having to have defined data types -- unable to be changed on the fly, resized, or have items deleted from the middle, and all those other nifty tricks that make Python so darn flexible. For the price of less flexibility, you can just pick another flavor of Python built for parallelism... completely transparently. It will optimize your code for taking advantage of multiple CPU cores better than you ever could manually in your code. It's just like C and Assembly Language. Sure, you CAN still hand-optimize your Assembly Language code, but why would you when the gcc compiler does it so much better than any human could, given the sheer quantity and complexity of the abstract instructions it's compiling? So, if it's obvious machines can optimize code better than humans, then why not let them decide how to break down a task and parallelize it as well? Seems like a computer sweet spot, to me. So, just go ahead using that Python you love. When you need parallelism for your app's performance to increase 100x with 100 cores, no problem. Just control your use of lists, throw your code onto a Ufora computing grid, and BAM! Python is a parallelizable, concurrent, non-blocking, whateveryoucallit modern programming language -- so sophisticated that concurrency optimization is done at the correct level -- runtime optimization, rather than explicitly all over the language, incurring such debilitating overhead as the curly braces that also litter many of those languages. Python is written for the human brain, and machines can adapt it to machines.

--------------------------------------------------------------------------------

## Tue Jun 14 12:38:30 EDT 2016

### Google OAuth2 Login and Token Refresh Under IPython Notebook

Pshwew! Okay, I feel like that was the end of a tremendous journey, just to get Jupyter Notebook working well with OAuth2 login and GSpread. But it's totally worth it. What future projects could this NOT be the methodology for?

Tue Jun 14 14:35:11 EDT 2016

Okay, I have a response to get out regarding some questions on properties. THAT'S what THIS ALL is about -- answering questions, getting to the heart of issues. The self-modifying journal stuff is as far along as it needs to be to be a springing-off point for the saveserp project. It's time to change its name to serpchiver. Yep, that's good stuff.

    conda create -n serpchiver python=3.5 jupyter

Tue Jun 14 15:49:48 EDT 2016

Okay, I have to move faster. I REALLY need to have this batch job running overnight now, and WITH either a list of anonymous web proxies or HMA proxy mixing up the IPs of the host machine.

--------------------------------------------------------------------------------

## Tue Jun 14 09:29:31 EDT 2016

### Forcing Myself Into Jupyter Notebook For Daily Workflow

Interesting. This will be a pretty regular part of getting started editing a journal now, if I'm not recycling a left-over terminal session:

    source activate journal

I'm having a tough time remembering this language, both under conda and virtualenv. Conda is easier, because there are no paths. But to help remember it, we can parse it, with some intellectual reasoning about which words are being used and why they're in the order they're in. Namely:

    source

...always comes first, because what we're doing is telling it that we're about to define the virtual environment source that we're talking about. The next word:

    activate

...tells WHAT we're going to be doing with the virtual environment THAT WE STILL HAVE NOT NAMED. And so far, the concept goes: "source, activate..."
No wonder I dislike this. I want source = something... but this follows a different model. It's like "source" is the program name for the virtual environment software, and what follows is the name/value pair it takes as an argument. In that light, it makes more sense. It's like compiling from source. Remember the notion of source code. It's source files. Maybe this will help, but wrap it up there. The name of the virtual environment goes last. The concept of source "code" goes first. And activate goes in the middle. If I can't remember it now, I'm just stupid (which is always a possibility). Practice will make perfect. Okay, it's almost magical that after a git pull, all the Jupyter Notebook stuff is here as well. Shit, this has MAJOR ramifications on my striking-distance projects. Development actually becomes significantly easier. Work-session states are memorized, beyond simply git repo stuff (which is nice, too) -- this is a whole other dimension of statefulness and continuous thought and work. Wow.

--------------------------------------------------------------------------------

## Tue Jun 14 06:58:23 EDT 2016

### Force My Way Through OAUTH2 Stuff... Again

This is the process of bootstrapping myself personally, in order to be able to move mountains on a daily basis. This ties right into those constantly recurring superpower secret-weapon themes.

Tue Jun 14 07:21:48 EDT 2016

Ahhh, took a nice hot bath. Okay, not too shabby. Finish out whatever you want to get done at home, then head into the office. Ugh, I'm not going to be able to get it done without making myself run late. But just remember that you STILL have a token timeout problem, and that you have the key to the answer in the freshentoken function in the pipulate project. You have, I believe, everything you need to always check for an expired token and always be refreshing it. BUT there should be a simple way with the Google ready-made code. Look for it before re-implementing anything.

Tue Jun 14 08:15:34 EDT 2016

I take that back. It is working as expected.

--------------------------------------------------------------------------------

## Tue Jun 14 05:24:13 EDT 2016

### Modifying Google Spreadsheets Through Jupyter Notebook

Okay... pip install gspread! Done. Now, think through the next steps. It SHOULD be easy, given... well, everything. It's going to all be about importing that authentication object. Ugh! Had to switch back to the oauthme.py program. At least my work yesterday wasn't wasted. I need to find the best combination of the old method (working for me) and the new. In either case, just hit that wonderful success-assured moment. This is going to be pure friggin' magic!

--------------------------------------------------------------------------------

## Tue Jun 14 04:21:36 EDT 2016

### My Self-Modifying Journal Is Born

I think maybe the objective tonight is to get to the point where you can run batch files at home, and then just intermittently come back to check in and babysit the tasks while you do things around the apartment. Ugh! You're going to be miserable tomorrow if you have to split your attention every which way. The best thing you can do is to finish your office-work, and do a kick-ass job tomorrow throwing your attention around every which way as batch files run in the background.
You have to get Jupyter Notebook right to that point where you can trust it with long-running jobs as you would try to do with Pipulate, and a better (and single) OAuth2 login for the task interacting with the service is a VERY solid start. Okay, so I added this o2.py file that does the trick and uses all of Google's latest advice. Now, I've got to do something in Jupyter Notebook that actually USES this authentication. Get the with open program back in here for my journal, and commit THAT to the repo. Okay, done. I have some more familiarity with the Google Developer Console to acquire. For now, I've got the credentials on the Google side set up correctly for the work I need to do. Okay, I just allowed the IPython Notebook files into the repo for this journal. I put one of those with open commands in there. Good start. But I have to push this friggin' thing forward HARD. Starting with the self-modifying journal, I think, is a brilliant start. Centralize resources and thinking RIGHT HERE. So, what's next? Now that I'm stepping through the journal a line at a time here, I should just print something when I encounter 80 hyphens on a line. Easy enough. Just count them.

    count = 0
    entry = (80*'-')+'\n'
    with open("index.html") as journal:
        for aline in journal:
            if aline == entry:
                count += 1
    print(count)

Okay, it's time to finally install and use dateutil:

    pip install python-dateutil

Okay, I've got 326 journal entries. This is going to be as cool as I had hoped:

    from dateutil.parser import *
    count = 0
    entry = (80*'-')+'\n'
    with open("index.html") as journal:
        for aline in journal:
            if aline == entry:
                adate = journal.readline()[3:]
                try:
                    parse(adate)
                except:
                    print(adate)
                print('%s: %s' % (count, adate))
                count += 1
    print(count)

And now, I should look at entering a line into a Google Spreadsheet per entry... or maybe something less, so that I don't have to wait. But this should be a very simple matter with GSpread. And overcoming THIS hurdle catapults you forward. I feel myself really taking control of the journal now. This dovetails with keeping the journal in git and Github just a little too perfectly.

--------------------------------------------------------------------------------

## Tue Jun 14 03:28:12 EDT 2016

### https://developers.google.com/sheets/quickstart/python

Okay, I managed to get myself awake early enough to do some good. And now, I have to choose between work on the home front and the professional work that I need to do. Organized environment, organized mind. Okay, but I have done an interesting experiment. Anywhere I pull down a latest copy of this journal, right while I'm editing it, I should be able to:

    :sh (drop out to shell)
    python oauthme.py

Confirmed! On any machine where I've installed the basics, including the google api client libraries, all I need to do is copy over my client_secrets.json file. This is a HUGE step forward. So tiny-seeming, but large in actuality. After I have that authentication, I should be able to do all sorts of other things out of that directory, with IPython Jupyter Notebook as my experimental, exploratory interface. Try that. Virtual environment? Conda on every one of my Macs? Hmmm, probably. Shoot.
Okay, download Conda on this machine. Already downloaded! Install it, and get MiniConda. While the Anaconda install is going on (~1.5 GB, ~20 Min), take a shower and get yourself REALLY up and about for today. Alternate between your real-life environment (which you REALLY have to get under control) and your professional stuff. I have a lot of Macs in play. But I do things a little differently than other people, and I do my work wherever and whenever I'm inspired, and I need lots of machines set up to just sit down and work on. And when I have a new preferred development environment, I need it in a lot of places. Oh, it doesn't look like MiniConda really has to be installed if you use the full version of Anaconda. Okay, made that mistake on one machine. I thought it was necessary to get the conda repository system, but apparently not. If you quit a shell and reload it after the Anaconda install, conda is there and in the path, ready to use. Okay, so make the virtual environment for your journal.

    conda create -n journal python=3.5 jupyter

This appears to be developing as a central tenet of the self-modifying code imperative that I'm starting out with on my journal. If it works out, I'll migrate the technique to other projects and code bases, most specifically Pipulate 2. I would like to get a certain amount done "tonight" so I'll be arriving in the office with all the stuff I wanted to do ready. Tomorrow evening I'm having dinner with Adi near Union Square, and that blows an evening of my recovering energy after the weekend. THIS is my Monday night, and my next opportunity will be Wednesday night. This just may be a situation of driving yourself into the fucking ground, and sacrificing your health to get ahead. Except that the professional work you're doing will make you gobs more effective WHILE AT THE OFFICE, and that is really the key here.

    source activate journal
    jupyter notebook

Okay, take advantage of this opportunity to switch to the Google v4 API. https://developers.google.com/sheets/reference/rest/ Ugh! Much better sample code for the OAuth stuff: https://developers.google.com/sheets/quickstart/python#step_3_set_up_the_sample Interesting! Okay, the credentials file gets stored in:

    /Users/[username]/.credentials

That's better than what I was doing, and the code sample from Google is:

- version 4 of the API, which they recommend using
- made specifically for the Google Spreadsheets service

This is a much better starting point than what I was about to do. So, remove oauthme.py from this repo, and add o2.

--------------------------------------------------------------------------------

## Mon Jun 13 11:31:50 EDT 2016

### Working on Generic Long-lasting OAuth Login

Okay, my last video, about reading my journal with a Python "with open" under IPython Notebook under an Anaconda conda virtual environment, is exporting. Usually it takes about as long to export as the video is minutes-long (on this 2011 Macbook Air), so it's a good time to plan out the next chisel-strike. Make it smart! It's probably going to be prolonged OAuth login. It gives me just enough time to think it through. Okay... think! I want to be able to trigger off a line at the beginning of my IPython Notebook session that ensures I'm logged in, with the type of login that's going to last for a very long time, indeed. I'll clearly need the google python client API libraries installed... ah ha! This will actually be under the virtual environment. Cool.
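That "one line at the top of the notebook" idea could end up as small as something like this -- a sketch only, with a hypothetical ensure_login helper built on the oauth2client pieces already in play, and oauth.dat as a placeholder path:

    # Hypothetical helper: refresh the stored token if it has expired, so a
    # single call at the top of a notebook keeps the session logged in.
    import httplib2
    from oauth2client import file

    def ensure_login(storage_path='oauth.dat'):
        storage = file.Storage(storage_path)
        credentials = storage.get()
        if credentials is not None and credentials.access_token_expired:
            credentials.refresh(httplib2.Http())
            storage.put(credentials)
        return credentials

    creds = ensure_login()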
Finish up this project operating on your journal, knowing that you will be switching to another file.

    pip install google-api-python-client
    pip install requests
    pip install gspread

Finally listening to the Kenneth Reitz Talk Python to Me podcast #6. How'd I skip that one?!?! Okay, listen to it to filter out the office scuttlebutt as you drive on into this next step. What will the next video look like, aside from just installing this client library? Oh yeah, gspread. Specifically, I want to get through a VERY long-duration OAuth2 login. And so... your best examples? The best Google examples are:

- https://developers.google.com/api-client-library/python/
- https://developers.google.com/identity/protocols.OAuth2
- https://developers.google.com/identity/protocols/OAuth2WebServer#offline

Okay... this is a good start. My last video is exported. I will upload it to YouTube now, and prepare myself for the next video, which conquers offline OAuth2 under Jupyter Notebook. Oops, I have to re-export it! I forgot to show my keystrokes -- an important feature of my videos, and really getting back my investment on these personal MacBook Pros. Take this time to get your next step clearer. After I do my 3 pip installs, what then?

- Go to the Google Developer Console: https://console.developers.google.com/iam-admin/projects?authuser=1

--------------------------------------------------------------------------------

## Mon Jun 13 09:34:08 EDT 2016

### Going to Try To Dive Deep Coding Today

Okay, gotta get my Monday morning report together. Get your second coffee of the morning first. Check over the SEO Pulse reports first. I've got a lot of work ahead of me, and I have more rein-grabbing that I need to do, and my tools for this rein-grabbing have to go up a notch or two. Perhaps I am living with different constraints under Levinux than what I'm trying to do here, so maybe a little bit of hubris and a willingness to embrace IPython, numpy and pandas is in order in all this.

Good Processes:

- Light weight
- Fewest number of steps to complete the process
- Really easy to access and understand

I'm getting really pumped about putting together my whole online shtick and how I produce my content.

- It can be hard to know what to automate.
- It takes many iterations to know HOW to get to the process.
- Increase the joy by automating away tedious steps.
- Expose the best data for pattern recognition and subjective meaning to humans.

Start with something that's "good enough" and comfortable and clear, and then gradually build it bigger. Document the WHOLE THING from scratch as best you reasonably can in your work-day. Who are the women in Computer Science who are really happy in the field? Make sure you set up the right role-models and examples. Don't cut any parts of the industry off from Adi as accessible and approachable. It should be everything from pure scientific research to invention to coding and such. Start here:

- https://kateheddleston.com/blog (thanks again, Michael Kennedy)

Okay... hmmm. Next step? I got my weekly report to my boss, and I have a tiny window to jump head-first into some awesome stuff and get some momentum going. I really love the video stuff that I've done lately, and I'm glad to be getting back to the talking-head videos. I tried setting up another machine yesterday, which I did from the Catskills. I already got a crappy comment on the video (Google Plus), even though I got 3 thumbs-up on it... hahaha! It got a "Demo Version" watermark across the entire video, but I pushed it up in a weekend hurry. I think...
I think I need to push forward on this front with enormous determination and focus today. I NEED:

- Prolonged OAuth2 Login
- Use of anonymous Web proxy lists

When major scraping tasks come up, keep scrapinghub.com in mind, because I want to just generate the data I need for gephi visualizations, and I need to get to that sooner rather than later. It's VERY tied into the Navigation Menu projects. Step 1... the Meta Journal!

--------------------------------------------------------------------------------

## Sun Jun 12 14:01:53 EDT 2016

### Turning Home Mac into Jupyter Notebook Dev Machine

This will be an interesting step. I basically blend Jupyter Notebook right into this same location where I keep the journal. After a :w save, I should be able to :sh out to a shell, and confirm my current directory with a pwd, then execute the command to make this a conda-style virtualenv...

    pwd
    conda create -n venv python=3.5 jupyter

Okay, now that I've done that, I should activate this environment BEFORE running the journal.

Sun Jun 12 15:35:05 EDT 2016

Actually spending LOTS of time with Adi. Typing here in the occasional everybody-doing-their-own-thing time. Basically, I want to quit this journal now, activate the environment, and go back into the journal, so that whenever I drop out into the shell with :sh, I'm still in the conda virtualenv with:

    source activate venv

--------------------------------------------------------------------------------

## Sun Jun 12 09:14:41 EDT 2016

### Sunday Morning Planning Meta Journal Programming

Another fine morning. I got an interesting YouTube video uploaded last night in which I got this PC in functional operation, insofar as having my whole set of things I want to be true for one of my serious in-the-zone work machines. I can see now that I'm setting up machines and environments specifically to be able to get myself *in the zone* or *in the flow* as the case may be. How can markdown deal with words like nix*, but the vim default markdown syntax highlighting not? :set syntax=off... ugh. Some tweaking to do.

I am currently editing a video on my phone about visual programming, and my decisions regarding Adi's programming. Editing out a little tantrum she had, and breathing exercises I taught her, because it's nobody's business, but I am most certainly keeping the original video for memories. It puts us both in the best light, and takes a little journey through rudeness, then a cut to much more somber energy, which livens up as I explain what happens between two people during a fight, and how we can rise above that.

That reminds me. Jupyter Notebook should now act as the bridge between this journal and the actual executing Python code world. Hmmmm. Very nice. I can make a module of stuff that parses and does stupid tricks with this text file as part of this git and GitHub repository. YES! Treat this file like a database, in a limited sense, with code that can open it, manipulate it, and save out a new version. It will be interesting. I will have to think about how the edits should be represented in the git history. In-location same-file editing, as far as git is concerned, surely. So no automated git copy and remove commands, lest we erase the easy ability to track a single file's evolution over the years.

Okay, yeah... so I'm getting the vision. Get this file a bit over to the self-modifying side, using ideas that themselves originated here... a lot like consciousness. Let this journal be a microcosm of consciousness.
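A minimal sketch of that treat-this-file-like-a-database idea -- splitting the journal on its 80-dash separators and pulling out the ## date and ### title lines. The filename is a placeholder, and the parsing rules are just assumptions based on how entries look in this file:

```python
# Sketch only: the filename and the separator/heading conventions are assumptions.
SEPARATOR = '-' * 80


def entries(path='journal.md'):
    """Yield (date_line, title, body) for each journal entry, newest first."""
    with open(path) as f:
        raw = f.read()
    for chunk in raw.split(SEPARATOR):
        lines = chunk.strip().splitlines()
        if not lines:
            continue
        date_line = next((l[3:] for l in lines if l.startswith('## ')), None)
        title = next((l[4:] for l in lines if l.startswith('### ')), None)
        body = '\n'.join(l for l in lines if not l.startswith('#'))
        yield date_line, title, body


if __name__ == '__main__':
    for date_line, title, _ in entries():
        print(date_line, '|', title)
```

Keeping it read-only at first sidesteps the git-history question; a save-out-a-new-version step can come later, still editing the same file in place.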
Project one: Get Jupyter Notebook executing code out of the miklevin.github.io directory. Okay, the TimeZipper and SelfModifer projects are almost one and the same. And so, I put them close together, and will start to take the first steps towards it right now. It should be simply running IPython from THIS very directory. Is this where I introduce tmux into my daily workflow now? That would involve the use of this newly acquired knowledge:

    conda create -n venv python=3.5 jupyter
    source activate venv
    pip install gspread
    [source deactivate]
    conda env remove -n venv

Seems a reasonable starting point. Get that meta feeling before Adi wakes up, and be at a good place in your mind. Think about a quickly cobbled together Pipulate bootstrapping environment under Jupyter Notebook. This could be very nice, indeed. Just throw a day or so at it. Okay... 1, 2, 3... 1?

--------------------------------------------------------------------------------

## Sat Jun 11 22:19:43 EDT 2016

### Beginning To Design New Pipulate API

I think I'm falling totally in love with Jupyter Notebook, which is a problem, because I don't want heavyweight tools. But it runs on laptops just fine. And it's part of a package that an organization is going very much out of its way to make sure runs pretty reliably on desktops -- Anaconda. And so... and so... my next steps with Pipulate are very much evolving. I love the web interface to true, useful, easy-to-debug Python. You don't feel separated from the Python execution, as you do with Pipulate. And so... and so... okay, think this through. How do I do this next step artfully, and in an additively useful way?

Well, one of the first steps is to use my various laptops and tablets and phones to great effect. Turn just a generic and awesome command of Python into a huge advantage, in having the ACTUAL ability to carry out ad hoc investigations fast, and indeed in a scalable way. I want full, programmable Python AND the coolness I was getting from Pipulate. I want to be able to produce incredibly valuable videos of obvious and noteworthy usefulness and utility -- the sort of stuff that really does motivate people to copy-and-paste a URL into email in that mythical way, or perchance to use those social media share buttons.

I am rapidly on the trail of something special... again... and there's nothing wrong with pivoting, when you see something worth pivoting for, and it's been a long time since I've found pieces so truly still generic and free and open source, yet working together so well as Jupyter Notebook, vim, full-screen Mac OS X, GitHub, and a few other interesting pieces. Now, I'm wondering how well Python will continue to fit in. Shall I be the one to compile NumPy for the Tiny Core Linux 7 software repo? And Pandas and all the other chain of dependencies? Hmmmm. I think I'm on the bleeding edge of a sweet spot again.

Here's my thing... I like Tiny Core Linux, because it makes your instance of Linux generic and virtually invulnerable to corruption and cruft. It will be a true, perfect, efficient image of a Just Enough OS to get your app running as an Internet appliance server. That server's whole existence is to support that one app, and so it doesn't need to be a beefy, all-purpose thing, and can therefore be optimized, to see if it even needs all of Python's dynamism. And if it doesn't, then execution can be delegated to PyPy or one of the other Python runtimes optimized for static conditions. And that sort of kernel delegation is actually available in Jupyter Notebook.
You could plug in PyPy or Ufora. It's amazing to contemplate, and brains much greater than mine are shaping a wonderful web-based Python code execution environment that surpasses your wildest dreams in terms of encouraging experimentation. I would be doing everybody a favor getting more people trying to do Python coding in such an environment as IPython. I wish I had been using it sooner. Nonetheless, that doesn't invalidate the discoveries I had made in writing the original Pipulate. I'm glad I went that route, after the earlier Tiger project that I had done at 360i. And this next iteration will still be different again, indeed.

It's going to sit lightly on top of IPython, dropping in a package here and there, to expose objects with these wonderful self-describing APIs asking for CSV files in some form or another, usually files in a folder of a conventional name, such as csvin and csvout. You drop anything you want pipulated into the path pipulate/in and pipulate/out. It just feels right again, doesn't it. You're pipulatin' and the output is pipulateout. All very elegant, and rolling off the tongue, making intuitive sense, as a system bound for everyday use should be.

Here's how we do it. Job requests are submitted as CSV files, of a conventional arrangement. These requests can be simply deposited as CSV files into directories located, by convention, within the relative location of the launched app, say "pipulate". So, you have a pipulate/in directory and a pipulate/out directory. You submit the job requests as files copied into pipulate/in. The most appropriate mechanism for that platform, such as OS event detection, will recognize this file, and trigger pipulating the file, which is to say carrying out the embedded request, and putting the fully pipulated job into pipulate/out, as a file of the identical name. In this way, files appear to simply "move", or maybe get copied from location A to location B, but now filled in with all this juicy data.

This is one method of interacting with Pipulate. Another is as HTTP requests sent to a web service running for the sole purpose of receiving just such a request. These requests are MIME-encoded CSV files. So, they probably originate as a POST request, meaning it's carrying some metadata soon-to-follow the initial request, carrying the data the request actually needs available in order to succeed. Commonly, this would be used for a file-upload feature. But the uploaded files have to be only CSV files, or else they're ignored. And the CSV file must be of a certain format, with the data arrangement following a certain convention, or else the request cannot be inferred, and nothing will be carried out. However, if row 1 is found to have specific keywords in it that map to certain globally available functions, then we likely have a job-request.

Okay, time to get back to Adi. More on this soon. I like where this is going.

--------------------------------------------------------------------------------

## Sat Jun 11 19:44:29 EDT 2016

### Getting Another Old Mac in Awesome Order

I'm in front of the fire here in Spring Glen Woods, on the first day of a 2-day open house. I'm pretty sure the weather kept them home. But tomorrow is another day. I like having a journal. And I like being able to talk to it.
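Stepping back to the pipulate/in and pipulate/out convention described above, a minimal sketch of the file-drop idea -- plain polling rather than OS-level event detection, and every name in it hypothetical rather than anything from the real Pipulate code:

```python
import csv
import os
import time

IN_DIR, OUT_DIR = 'pipulate/in', 'pipulate/out'    # conventional folders from the notes above


def pipulate(rows):
    """Placeholder job: a real version would dispatch on the row-1 keywords."""
    header, data = rows[0], rows[1:]
    return [header + ['filled_in']] + [row + ['TODO'] for row in data]


def watch(poll_seconds=5):
    os.makedirs(IN_DIR, exist_ok=True)
    os.makedirs(OUT_DIR, exist_ok=True)
    while True:
        for name in os.listdir(IN_DIR):
            if not name.endswith('.csv'):
                continue                            # non-CSV files are ignored, per the convention
            src = os.path.join(IN_DIR, name)
            with open(src, newline='') as f:
                rows = list(csv.reader(f))
            if rows:
                with open(os.path.join(OUT_DIR, name), 'w', newline='') as f:
                    csv.writer(f).writerows(pipulate(rows))
            os.remove(src)                          # the file appears to "move", now filled in
        time.sleep(poll_seconds)
```

Swapping the polling loop for real OS event detection (or the HTTP POST route) would change only the trigger; the in/out contract stays the same.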
I believe I am satisfying one of the basic human needs to communicate, but I'm doing it with a bunch of strangers out there, who kinda sorta tune in now and again, who I mess with, sometimes being deliberately dense and arcane, and forcing the reader to parse through thick, dense chunks of the stuff, just so that I can have a laugh. I'm kinda mean that way, and I admit it. I do have that mean streak, and that's how I'm expressing it. I pity those of you who got stuck in the lines I'm trolling out there, with this journal and with my YouTube videos, which can be found at https://www.youtube.com/miklevin which will do a forward to whatever the heck URL-system they've decided upon most lately, to ruin the root path awesome ones.

So... here, I am getting another of the laptops in play up to the latest post-Mavericks, which is El Capitan, I believe. I still miss those cat names, but it would be getting ridiculous by now. I mean, where do you go after Snow Leopard? You kinda are at the apex of cat awesomeness, and anything else will simply sound silly or forced and will be taunted by the media, regardless. So why not just switch, and keep it classy? Okay, let's use some darn obscure, but uniquely American references, which also happen to have extreme physical beauty associated with them, and provided to the users in an abundance of screen flavors.

I will shoot videos. I will not edit much. My audience will appreciate that. I will navigate my way through treacherous waters, and come out the other side thriving. I will get this accomplished in time to enjoy total financial freedom a bit. Up to that time, I will struggle. But I will do it Internet personality-style.

I already earn myself nemeses. What's the plural of nemesis? I've had a few. One was legitimately a class ahead of me, tech-wise, at first, but by the time we were done, I realized why people like him were relegated to never being much more than quality assurance engineers. While they appreciate a good hack, they lack the imagination to legitimately pull an original one off for themselves, no matter how much bravado they boast.

I know techs. Techs lie. They figure they've learned a few tricks about how to control the keys to the kingdom, just so that everything keeps working. And everything keeps working best if nobody can do anything. That is, if your tasks are running in an operating system so restricted, so limited by the security-conscious, that they can do almost nothing, then nothing is happening, and no panicky calls will come in at midnight about someone's print queue constantly being killed as fast as you can send a job to the printer. Networks are a little mischief treasure hunter's dream. So, lock down, lest another technically-inclined pup grow into a big dog. Neuter pups fast. That's the rule. Be the big dog in the junkyard, and keep a firm grip on the reins of command and rule, and do this by putting yourself at risk and under a great deal of stress, and try to neuter every new young whelp who may look like a threat in a few years. Win battles now while they're easy to win. Play an effective game of didn't-know-there-was-even-a-game-going-on. Make young whelps run away yelping, too sore and the lesson too well learned to ever go sniffing around the old junkyard dog's turf again. Yeah, I know those types. They existed. They still do exist, though ever increasingly in hiding.
There are too many ways now for intelligent young whelps to effectively handle jealous old-timers, if those old-timers don't have the actual skills of experience and still nurture a youth-like adaptability. My book will mostly be for those information-starved pups who don't know how to work the whole system effectively enough on their own yet to master a sort of baseline awesome in tech, which underlies almost everything, and will only do so more every year.

--------------------------------------------------------------------------------

## Sat Jun 11 17:37:18 EDT 2016

### Don't know a headline for this yet

Hello World, from the Catskills, perhaps for one of the last few times before I sell it as part of getting my finances under control. Stop! No more needless spending. I can live much more cheaply, and think I should, so that I have some expendable income again. So, goodbye to a lot of cornerstone things from the last ten years. Life changes. We move on, and sometimes that means, we move. Friends need not be lost. There's always a way when you're connected. We are connected to people in many more ways than where you both happen to live in the same vicinity. Things start by coincidence, but they continue by intent. We are humans, and we can think, and we can control our own destinies, like few other classes of lumps of matter. Yes, we exist, and for that very reason we should exert our will, when we believe in something.

I believe I have something to teach you, and perchance, and perhaps, and maybe through some cosmic coincidence, teach you too. You, there. Yes, I'm talking to you... amazing, isn't it? Better than watching someone play Yo Kai Watch on a Nintendo DS, for sure, but whoever that guy is, he's got my daughter into wanting a Nintendo DS... way to go. I asked for it. It will be interesting warning her against the biggest bottomless pit of energy-sucking the world has ever known: immersion into a virtual world that beats out reality in so many ways that you lose touch with what it REALLY is to be a human, without unlimited god-like abilities in your own imagined, but somehow still very real, virtual world. Again. Nested. Avoid.

Package everything that you consider important about you to-go. You will be happy some day when you find that you need that sort of freedom from place about you, so that you can let go of your roots to follow your own dream. You are not a seed in the shadow of a tree. You are an independently moving inverted-donut offspring of two others sorta like you, called your mom and your dad. Congratulations. You have survived the first round in a game in which you may never be fortunate enough to have a chance to hear this from me. Things are amazing. I love you. And maybe someday, you'll read this, but maybe not. Who knows. I may blank it all again. Peace, out. Checking on the fire.

--------------------------------------------------------------------------------

## Fri Jun 10 23:21:04 EDT 2016

### Catskills Plans with the Grands

There's no way around it, I'm falling in love with Jupyter Notebook. Jupyter should have renamed themselves to IPython -- not the other way around. This is as awkward as the Google re-branding of Webmaster Tools as Search Console. What next is the question now. This is a small window until I'm in the Catskills for 2 solid days, until probably late Sunday night. I am lazy at heart.
I look forward to not having the pressure on me to run off to the Catskills EVERY blasted summer weekend, when I could really be getting some things done around the house (with Adi) every few weekends. Not everything needs to be a whirlwind tour of wonderful. Sometimes, you just spend quality time together, getting stuff done that needs to get done, because my only time with her is also the only personal time where work doesn't suck up all my energy... i.e. time I can actually get a thing or two done. My success as a separated father depends on my being able to mitigate that particular issue brilliantly well. I am not 100% sure of my approach, but not having the Catskills bungalow anymore, yet the grandparents having theirs, certainly works to my advantage in this regard. And so, I have some decent ability to occasionally think deeply through things. I have to decide about the last week of June, when Adi's spending a week with them.

--------------------------------------------------------------------------------

## Fri Jun 10 23:08:23 EDT 2016

### What Should I Teach My Daughter?

Your primary goal and purpose now is to keep all disasters at bay while Adi grows up and goes through these critical, formative, shape-who-she-is years. I like the person she's becoming so far. I have to become That Coding Guy for her. She really struck a chord with me, when I asked "who's that awesome kid?" and she answered "me?" And then she went right into saying "who's that coding guy?" to me.... wow. It struck me. And I have a duty to help her become the fully realized potential awesomeness level of being that guy's daughter. I'm going to do better for her than my dad did for me, by being tuned more into the... what? The current events from a technology point of view.

We are living some of the greatest human history ever lived, right now. The Free and Open Source Software movement, and to a lesser degree, the just pure Free Software movement. Coupled with open source hardware, already here I think in some cases, such as certain early MIPS processors, this will result in some core digital DNA material with sufficient instructions to be self-booting, given a few robots with power and simple fab plants. We're on the edge of robot colonies in the asteroid belts and such. Think about those things, and prepare your daughter to think about those things.

--------------------------------------------------------------------------------

## Fri Jun 10 11:16:22 EDT 2016

### Jupyter Notebook IPython Success Assured (For My Next Project)

Back to the issues of the day. Let's get that SERP archiver done. And I don't want to do it through Pipulate, but I DO want it to work with Google Spreadsheets. Very interesting! I have a lot to learn now with Anaconda. After I did my first 2 successful attempts:

1. Load a module I created in the same IPython directory accessible to IPython
2. Put that module into git, so I can also use vim and revision control

Then, I tried to get virtualenv working with no success, realized that's built into Anaconda's conda system, and I installed MiniConda using the script they have you download and execute in the Bash shell from the OS X terminal. And now... and now? I'm very motivated to jump on this IPython bandwagon, so I'm going to work out these last few annoyances. Figuring out this virtualenv built into conda is one of them. First, you have to choose a package to create the virtualenv with. It can't just be an empty one. And so, I'll create it with requests.
    conda create -n venv python=3.5 jupyter
    source activate venv
    pip install gspread
    [source deactivate]
    conda env remove -n venv

Fri Jun 10 17:35:08 EDT 2016

I'm exporting my last video of the day. Wow, that was 2 from my morning commute walk, and 3 from talking-head coding, which I'm FINALLY re-starting. It's already 5:36, and I'm hoping to get the file-export and YouTube publish done before I "have to" walk out at 6, because I'm meeting Rachel and Adi down in St. Marks. But this sets me up SO WELL for my new work. I would rather have IPython as the "wrapping" framework of this work than purely GSheets. I can always still use Google Spreadsheets, but what I'm potentially getting out of the picture is the feeling-of-control disconnect that occurs between coding Python and what happens when you hit the Pipulate button in the Pipulate U.I. I feel like I want to jump on the IPython Notebook bandwagon. It's a bit heavyweight for the Levinux projects, but it's exactly what I need at work... hmmm. I don't really need to make everything the same thing. Revel in the nuanced differences. Use the right tool for the job. Don't let "not invented here" or any other syndrome keep you from using the very best tool for the job.

Okay, 8 minutes remaining on the export. Get yourself ready to leave. Push out this last entry for the day. Then, publish to YouTube as the last thing, and don't wait for the processing, because that finishes server-side, even once you disconnect.

--------------------------------------------------------------------------------

## Fri Jun 10 11:15:31 EDT 2016

### Okay, Unix Won. But Why Python Too?

It is clear that Unix has won. But it is not entirely clear -- fogged by the atmosphere of JavaScript -- that Python has won too. Yes, while it's all about choice and appropriate tools, Python is just being chosen first for the world's automation tasks at a surprisingly increasing rate. And it's already a 25-year-old language, and you'd have thought all a language's glory would have played out by now. Not so... with Python, it's a steady, constant build. But why? How could this little-known, strongly opinionated, and often counter-intuitive "scripting" language STILL be taking the world by storm, in the face of JavaScript, which literally runs on billions of browsers, with a nifty little Web browser system for retrieving the code to run? And if it's not JavaScript eclipsing Python, shouldn't it be Java or C# or Swift or some other heavily vendor-backed "Enterprise" caliber language? Or newer languages like Ruby or Rust? Nope.

There is a reason that an increasing number of the top computer science schools in the world introduce students to programming with Python: you fall in love with it faster, and it will inspire you more to carry on with your pursuits, be they personal or professional. Python serves many needs, and can be molded to your needs. And if the details of the language's official reference runtime, CPython (the python.exe from python.org), don't suit you, just swap out the runtime with PyPy, Pyjion, Pyston, Cython, IronPython, Ufora or whatever suits your situation best.

So, what next? What goes on top of *nix? How do we sit down and just start doing things? How fast? On what variety of hardware? Using what supporting software tools, like a text editor or integrated development environment (IDE)? At what cost and with whose support? Against how broad a set of problems are you likely to need to use programming?
But when it comes to building the next level up, there are countless possibilities and approaches. Every language is opinionated, so the question quickly becomes: whose opinion do you want to abide by most? And there are some very big opinion splits here, not the least of which is whether you go for blazing speed or fabulous flexibility (static vs. dynamically typed). Another is how well the language fits the sort of problems you'll be trying to use it for. How do you like the way opinions are expressed in the language design? And Python is opinionated, indeed.

--------------------------------------------------------------------------------

## Fri Jun 10 11:15:04 EDT 2016

### The Case for Unix

This book opens with accepting a reality which, for better or for worse, is a reality. Unix won, and it's best to focus on the parts that have made it so popular, and indeed love-worthy. Short, powerful commands, with an endearing inconsistency of command parameters in which find and grep, which could have shared an API, are instead mirror images:

    find . -name "filename*"
    grep -r "stringinfile" .

Nonetheless, you get those classic, universal commands, such as cd, ls, cat, sed and command-piping, and every communication operation adhering to the filesystem read/write interface. All this combines so that you can design some pretty cool systems to live and run on top of it, with lots of individual components that talk to each other the same way. By being designed to be portable across different hardware, Unix is also designed to be potentially viable with all FUTURE hardware. And GNU/Linux is so much like Unix, we can refer to them together as *nix. But then, sometimes I just credit Unix as the innovation, and Linux as the liberator. No matter what you call it, it is a solid, standardized technological underpinning that you can build on top of with the confidence that your apps will port to any *nix platform as well.

--------------------------------------------------------------------------------

## Fri Jun 10 09:19:31 EDT 2016

### Planning On Emulating Pipulate with IPython

Okay... this will be an interesting day. I'm meeting Adi in the city after work, so there will be no working late. I have an ambitious project I want to wrap up today, so that I can run long batch jobs over the weekend. It also sets the stage for beginning the Pipulate 2 port... to Python 3. Maybe I should call it Pipulate 3? Either way, I have to hit the ground running hard, having done my research into using proxies with Pipulate. I am also, in all likelihood, going to need to do OAuth2 authentication for using the Gmail API through the standard Google API client libraries. Hmmmm. Alice & Bob log in.

While the nature of my greater work is clarifying, the complexity of today's work -- the ambition-level -- has to be controlled, so that I can have a smooth-running, weekend-durable script running... one that can scrape and store Google search results in a Google Spreadsheet... hahaha! Okay, violation of terms of service, I'm sure. But my investigations are rather small-scale, as these things go, and I'm trying to remain as efficient as possible. I think I may actually like to do an implementation of Pipulate using IPython (Jupyter Notebook) as the new outer shim. That could be really interesting. It would give me a much more powerful, and potentially responsive, environment than Pipulate through GSheets.
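For what "scrape page 1 and park it in a spreadsheet" might look like, a minimal, hypothetical sketch: fetch the result page with requests (optionally through one of those anonymous web proxies), then zlib-compress and base64 it so the whole view-source blob fits in a single cell. The keyword list, proxy handling and sleep window are placeholders, and the same terms-of-service caveat from above applies.

```python
import base64
import random
import time
import zlib

import requests  # assumed available (pip install requests)


def fetch_serp_html(query, proxy=None):
    """Return roughly what view-source shows for page 1 of results (sketch only)."""
    proxies = {'http': proxy, 'https': proxy} if proxy else None
    resp = requests.get('https://www.google.com/search',
                        params={'q': query},
                        headers={'User-Agent': 'Mozilla/5.0'},
                        proxies=proxies,
                        timeout=30)
    return resp.text


def pack_for_cell(html):
    """Compress and base64 the HTML so it can sit in one spreadsheet cell."""
    return base64.b64encode(zlib.compress(html.encode('utf-8'))).decode('ascii')


def unpack_from_cell(cell_value):
    return zlib.decompress(base64.b64decode(cell_value)).decode('utf-8')


# A throttled, one-off pass over a small placeholder keyword list.
for keyword in ['levinux', 'pipulate']:
    packed = pack_for_cell(fetch_serp_html(keyword))
    # ...write `packed` into the "google" cell for this keyword via gspread...
    time.sleep(random.uniform(30, 90))  # space requests way out, per the notes above
```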
There's a reason IPython was created, and they were solving that same where-to-execute-code-easily-and-make-it-humanly-accessible problem as me. And it's a MUCH better environment for people to develop their own functions in... hmmm, a plan is hatching, which I could even conceivably implement today. This could be a practice run for Pipulate 2 (3?). I could probably cobble enough together to do everything necessary to fill in the "google" cells, which are compressed and base64'd Google search results (page 1). That is to say, it's a fairly default, typical search result generated by typing a few keywords into Google, and hitting Search. It's what you would see in the browser as page 1 of results -- or more specifically, what you would see if you selected the browser's "view-source" option for that page. That mostly HTML-and-JavaScript gobbledygook that you see is the code that makes the browser show that search result page, and it does indeed contain the data behind the first page of search results. So this project is capturing that and sticking it in a spreadsheet for later reference.

AND after a while, Google starts throwing up a captcha, to keep your automation from working unless there's a human there to interact with the web browser from that same machine, or some proxy-authentication technique is used to notify a user, who solves the captcha, which is sent back to the Google server as if it was the original response to the captcha challenge... no good. Needs a human to break the captcha, or some machine learning stuff that's way beyond the scope of work I want to do today. It's wayyyy easier to just change the IP of the machine that you appear to be surfing from, by bouncing off "anonymous web proxies" -- which are basically just web-surfing repeaters that make it look like you're surfing from the repeater location, and not your original machine. These lists "go bad" after a while -- a very short while, if they're the distributed lists that everyone gets a new one of from services that sell such information. So, you have to race to get to the good ones, and then use them until they're no good anymore against the data service you're hitting.

My jobs are small. I really only have a few thousand search positions to check, which I'm only doing the 1st default page of (not setting it to 100 results/page), and I'm spacing it out over a long period of time, and I'm not setting the overall job to recur at all. It's a one-off investigation.

Fri Jun 10 11:08:40 EDT 2016

Wow, already 11:00 AM. I got distracted by a request that came in by email. Get into that zone. Oh, commit this, and then dump in your commute writing.

--------------------------------------------------------------------------------

## Thu Jun 9 21:16:17 EDT 2016

### Pre-Weekend

At home relaxing. Last evening before picking up Adi for the weekend. I am really not taking advantage of my potentially hugely effective evenings as well as I should, after all this time. My body has immediately gone into decompression mode, and I am enjoying getting enough sleep again. I can feel healing occurring, and I am drinking it up. But I am not disciplining myself to the degree that I need to, these days. I don't push myself hard enough, or take myself into the zone AFTER a full day of work... hmmm. Maybe mornings are your only hope, and maybe it's early-to-bed, early-to-rise. Either way, I picked a lock for my first time today. I "get it" with the tension bar and stuff.
I guess I picked up a bit watching those people at the Maker Faire, and the YouTube video today, and by the very shape of the lock picking tools. Neat! I could get into the zone, but I'm not starting a day-cycle early enough. Maybe get a little sleep, then get to it early tomorrow morning. If so, I should really go to sleep right away.

--------------------------------------------------------------------------------

## Thu Jun 9 13:49:31 EDT 2016

### Captcha Problem r-u-tek?

I've got a meeting coming up in a few minutes. Wow, do I want to do this decrufter project, bad. Think about the decrufting project as part of all other projects.

Thu Jun 9 16:31:58 EDT 2016

Hmmm. What next? I need to finish finding the homepage of all those company names. Captcha being thrown up. Hmmm.

--------------------------------------------------------------------------------

## Thu Jun 9 11:11:13 EDT 2016

### Gotta Figure Out How To Organize My Book Here

Ahhh, a full-screen Mac without tmux is still better than a large screen sub-divided into panels with tmux. These are all things I can talk about in my tech book. Speaking of my tech book, I have another chapter developing:

# So You Wanna Be a Tech (Tek)?

## Acquiring Super Powers in One Easy Lifetime

- Humans & Their Environments: Tools, Rules & Fools
- How Unix Won & Why You Should Care
- SuperUser You! The Initiation
- Alice & Bob Login
- The Tyranny & Brilliance of OAuth

Greetings fellow humans. It's a good thing we speak the same language, or you would not have an easy time reading this introduction. Everything's a tool, not the least of which is language itself. Reading, writing, hearing and speaking are four discrete aspects. Not all languages have all four components, as any of the countless dead, forgotten languages can't tell you. They're dead. Nobody wrote them down. How could they? Our tools define us. How could they not?

The natural selection process of evolution shaped our bodies and hands and various other appendages and orifices. If you don't believe that, you're not going to like the rest of this book much, as it deals with adaptation, and in all likelihood, adapting your very evolutionary course, as we focus on the multitudes of human/machine interfaces at our disposal today and in the future, and which and whether and how we take them up to augment and enhance and extend our lives. Life and tech are indistinguishable, or, well, they should be, for what are we but nature's spontaneous, and I would assume inevitable under the right conditions, matter-organization machines.

You and I are shaped like donuts. When we eat, food doesn't go inside us, but rather through the donut hole, where it gets feather-dusted by fractal magnets that suck off the elements and compounds it needs to keep the donut moving and self-repairing. Oh yeah, the donut can move itself around like a vacuum cleaner. That's us -- and also the blueprint for most of the forms of life we call animals. There are variations and blurred lines of demarcation between animals, plants, and other forms of life. But in all cases, there's a spectrum of all sorts of matter-organization states we call life and lifeforms. We're one nifty little example, briefly lived so far in the grand scheme of the 14 billion year-old cosmos, so far as we know it. And who knows what's really the ultimate scheme of existence? Certainly not us, as our observations are coming from inside that system. Objectivity is unprovable, from our perspective.
The problem of induction proves to most intelligences confined within the system that all our senses and experiences MIGHT be lying to us. The existentialists may be right, after all. Each of us may be the only one that exists in our own personally invented universe, for how could we possibly prove otherwise? That's the problem of induction. Just like an antenna picking up vibrations in the air, only to be turned into electrical signals (through induction) and amplified back up to an approximation of the original signal that our ears can hear, so too operates our brain. We have input devices, just like keyboards, mice and cameras on computers. They are our five senses. All of our senses take input in from the world through induction, so the world we're experiencing isn't really the true objective world -- the "viewing" of which with god-vision would probably drive us insane. The world we actually personally know is not the objectively true one we often imagine, but rather only the best sense our well-designed tool-of-a-brain can make of it -- its own little notion of what our world must be like -- or, a virtual simulated world that exists inside our minds.

So, it should come as no surprise that this same virtual-world, or virtual machine, notion recurs all throughout technology, literature, philosophy, and yes, even increasingly pop-culture, as the geeks take over the media. Woot! Okay, so let's not draw an arbitrary line between life and tech. There's as much chance that our whole universe is just a virtual instance. But as anyone discussing VMs and the cloud SHOULD always remember, even virtual machines are running on real hardware. In a very real sense, there's no such thing as virtual. Everything that is, whether perceived as virtual or IRL, consumes resources to continue to exist. Virtual states are just some sort of layer or wrapper that maps and regulates the consumption and use of those resources. Such are things in all cases -- even the thoughts in your head while you read this.

I cannot, for example, implant a memetic information virus inside your head as you read this, because the parts of your brain that decode writing and execute an eval statement to reproduce my idea inside your head can't actually drive you instantly insane, or throw you otherwise out of smooth function -- not without engaging you much more deeply, over a longer period of time, and actually gradually reprogramming you (brainwashing). Your brain has simple defenses, like not really being able to learn a thing too fast without a preponderance of supporting evidence from your senses -- second opinions, as it were. Our minds, and life in general, have many redundancies and backup systems and alternative methods built in.

I'm pretty sure that in going from the primordial ooze where the first sparks of life took hold, all the way along the millions of years it took to lead to us, the life-hack experiment variations on this same planet that led to us are as good as infinite. The radically different life-attempts to organize matter far outpace the general cosmic-scale heavenly bodies in terms of interestingness. A bunch of hydrogen becoming a bunch of helium in a fiery ball... ho hum. But fuse it all together, then blow it up into a myriad of heavier elements that gravity will ensure clump up into all sorts of interesting things in the immediate vicinity... now, we're talking. Sign me up! Let me play in an interesting universe that abides by those rules. I'm game. Oh, I did? I am?
And now, I'm playing alongside Aristotle and Einstein, and that first caveman who greased an axle? Shit, okay. Fuck, I'm already 45 years old, and have done nothing special. I'm nobody, with no credentials or prior claim to fame or child prodigy predilections, or even particularly fortuitous head starts, aside from being among the privileged 1%: literate, earning well, and relatively free. Yet, I'm going to try, as Steve Jobs had put it, to make a ding in the Universe. Here I go:

The Drake Equation is bullshit. Even if the numbers bear out, the universe is a very big place, and we as a society are in all likelihood as alone as any one of us ultimately is as an individual. The experience of aloneness is very likely coded into our universe as a first stage in the development of real intelligence...

To be continued...

--------------------------------------------------------------------------------

## Thu, Jun 09, 2016 10:46:03 AM

### Picked My First Lock

Wow, not getting my journal entry started until quarter-of eleven. Quarter-of? Wow, I am old-school. Okay, think! Pipulate is rocking cool, awesome. It's the secret weapon I've imagined it to be, but ONLY to me as the expert user. I see the login info and usage-counts of other people trying to use the system from the limited version that I do put out there for everyone to use, and there is some tire-kicking occurring, but it is not everything it needs to be to light the world on fire. But it WILL be. I have so many things right about this project that I do have to do my final bit of thinking regarding how to do the next awesomification step.

I have a meeting coming up that I wanted to have a laptop at, but my main work laptop is tied up with the Pipulate batch job that I don't want to interrupt, and my personal Mac is in a file cabinet drawer I haven't been able to open for a few days, because I misplaced my main keys at home. And so, I brought in the lock picking kit I had at home, which I bought from the NYC Maker Faire a few years back, and just watched a YouTube video given by some middle-school kid on his school locker. I got the idea of the tension-bar and the pick, and so I tried it on the file cabinet, and it only took a few moments. BAM! Wanted to capture that moment in the journal. Only took me 45 years to have that experience. I'm going to make sure Adi has it when she's still only five. I'm going to be the dad that I wish my dad was.

So, now I have my Mac back, and I can make the decision between continuing my journal work on my PC, with my noisy keyboard and its sub-area-identifying panels, or moving back. It's less stress to keep it in a mostly fixed location on my Mac. And so, back to the Mac for journaling...

--------------------------------------------------------------------------------

## Wed, Jun 08, 2016 10:09:00 AM

### Revisiting a SERP Function

Okay, what should today look like? I did a killer couple of videos today, along with some subway writing that I'm pretty proud of. I'm getting used to tmux, and I really like my 3-panel arrangement, with the journal being 80-column, plus a little, so that there's not a line-wrap jump as vim attempts to scroll as I type past my auto-hard-wrap 79-character limit. It's an even more distraction-free full-screen text editing experience, but with a few tiny concessions to "consistent" distraction, in the form of the other two command-line panels. If I turned off the Windows 7 Aero transparency (always a good idea), and made the taskbar auto-hide, this environment would be nearly indistinguishable from Linux.
I guess I'm just a few months too early to have achieved this Zen, because once that Bash shell really hits in Windows 10, I'm going to be sorely tempted to dump this Cygwin approach, install Windows 10, and use that Ubuntu Bash Shell. Sheesh! Okay... think through next steps. Keep the transparency and the taskbar. Windows must remain usable, and I've never had a positive experience with auto-hiding its menus. I've also given up the odd Ubuntu style of putting the icons on a left-rail "taskbar". I don't particularly have a favorite yet, but at least it's consistent between default Mac OS X and Windows, with it running across the bottom. I guess that's why so many people like the Mint Linux distro.

It's time to start grand-unifying my endeavors. I need to get this company homepage finder function done in Pipulate. What was the name of the previous function that... oh yeah! It was actually bound to the spreadsheet as a custom scraper... nice. Wow, Pipulate is so ideal in so many ways, but it is essentially still "cloaked" in a wrapper of it being somewhat difficult to set up. Even Levinux doesn't elegantly solve that problem, because of the slow server build-time and lack of solid tutorials. It's very close though. That trick I learned to not fetch all dependencies will be key to speeding it all up. I still want them to do the server build on their end, as that is part of the Levinux Education. Levinux Education... hmmmm. I like the sound of that.

1, 2, 3... 1? Get Pipulating effectively again. Make a version of the SERP function that ONLY pulls the first page of results. That's all that's needed to grab the most likely company name out of the search results. And do I keep it in XML? Can that seldom-used Google search API respond with JSON? Because it would be cool to get away from XML, for a more modern look to what gets field-stuffed into Google Sheets. Go take a look. Do a ONE ROW hit using the serps function as it exists today.

Okay, with a look at common.py / def serps, I see I already am returning JSON (nice), and there's a variable called "times" standing for pages-deep in results, with 8 results-per-"page". So, this is a fairly easy test. Verify your thinking by looking at the code. Confirmed. Go use the serps function. Interesting! I had done the pip upgrade-all convolutions on the Pipulate development server:

    pip freeze --local | grep -v '^\-e' | cut -d = -f 1 | xargs -n1 pip install -U

...and now I get the error on Pipulate:

    IOError: [Errno 11] Resource temporarily unavailable

...from something deep in werkzeug. And so a little googling tells me this goes away by going out of debug mode, and so it did. Woot! Okay, let's kick those serps function tires again. Oops, that nice API that gave JSON SERP results is (finally) no longer available. Okay, this is going to be the old situation of hitting against Google screen-scraping style. I'll have to throttle the speed, significantly, but there are plenty of reasons to have to do this sort of thing. Hmmm, now I'm thinking I might actually PREFER to be in IPython. Let me think how that would work, precisely. Ugh! No, it's still best to stick to Pipulate. Sweet spot. Just bite the bullet and work on prod.pipulate.com, until.... well. This will be a very temporary thing. I'm preparing to reboot the Pipulate project, and it will be worthwhile to keep one working instance somewhere, but not even truly necessary, since it runs off of localhost as well. Think! 1, 2, 3... 1?
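On that werkzeug error: assuming Pipulate's web layer is a Flask-style app riding on werkzeug (an assumption -- the journal only mentions werkzeug), the debug-mode switch that made the error go away is just the flag passed at startup:

```python
from flask import Flask  # assumption: a Flask-style app sitting on top of werkzeug

app = Flask(__name__)

if __name__ == '__main__':
    # Debug mode pulls in werkzeug's reloader and debugger; switching it off is
    # what made the "[Errno 11] Resource temporarily unavailable" error go away.
    app.run(host='0.0.0.0', port=5000, debug=False)
```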
--------------------------------------------------------------------------------

## Tue, Jun 07, 2016 3:04:55 PM

### Distraction

Okay, I have an actual project to try to do today, but it's been a day full of distractions -- important ones I had to get done, but here we are at 3:00 PM, and I'm only just getting started. This is a project that I could easily do with Pipulate, but the essential Python function will be the same with or without Pipulate. I think I want to take the IPython route on this one, putting all the source data items in a text file, and then stepping through it. I want to get a feel for what it's like on IPython, under Python 3 on my local machine. 1, 2, 3... 1?

Wow, as awesome as Jupyter Notebook is (IPython), this project is like 100x harder than just using Pipulate. Strong validation of what I built. I so want to reboot that project into something reflecting my latest thoughts. Actually, it was my ORIGINAL Pipulate thoughts, which I overrode in time, as that Google Docs slope is a slippery one.

--------------------------------------------------------------------------------

## Tue, Jun 07, 2016 10:43:35 AM

### tmux -2 under cygwin for 256 color xterm

Switching my Cygwin mintty font from Lucida Console to Consolas. I'll give that a try for a while. It makes the tmux panel separation lines solid, for example. Also, I learned that to get proper xterm-256 colors in tmux under Cygwin, you have to run it with the -2 parameter:

    tmux -2

...and it comes up looking as nice as it does under Linux, now. My misspelling highlight is now that nice soothing maroon again, instead of that vulgar blood-red. I'm getting pretty tired of always teaching it my spellings. I'm very tempted to dump my spell-check file into my vim GitHub repository, like my .vimrc file.

--------------------------------------------------------------------------------

## Tue, Jun 07, 2016 10:01:54 AM

### Time to turn my SolidRun CuBox into a server

Time to get that SolidRun CuBox in operation. Yay! My inclination towards a (modern) micro computer approach to things has immediate utility. Once upon a time, micro servers were like refrigerator-sized boxes. Today, they're palm-sized, and wayyyyyy more powerful. It's not my main duty today, but it sure is desirable in order to make each day that much less stressful, and to take the ball-and-chain off my laptop. I could have chosen one of several Raspberry Pis or even SheevaPlugs that I have sitting around, not in service right now, for this task, but the CuBox is my favorite. How do I have this ability to just toss around servers, recruiting them into my service, like they're so many juggling balls?

What most people don't have is continuity over time. They give up at exactly the worst time to give up. When you're waiting for something to happen, and you just reach that point where maybe you think it's not worth keeping at it, stick to it for exactly twice that length of time your inner animal gut tells you. It's calibrated for a life-or-death, follow-the-herd, predator/prey, always-moving lifestyle. We're in the modern information society, and things are a bit less life-or-death than they used to be, so have patience. Wait twice as long, but not a moment longer, or else you may be missing easier pickings elsewhere.
Who's to say what's really the best approach to things, but putting too much work, effort or time into something where there's no OBJECTIVE reason to think you're going to be rewarded is one of the worst possible uses of the most precious and irreplaceable of your resources -- time. Life's too short to spin your wheels on a dubious journey... too often. Fool me once, shame on you. Fool me twice, and I'm liable to kick your ass for making the attempt. In other words, be alert and learn. Make each next iteration of your attempts a little smarter, accessing newly acquired and old classic information from your past forever more effectively -- EVEN when the situations are new. Always be developing and improving your how-you-abide-by-life abstractions. And that's where the metaphors come into the picture -- things like the 80/20 rule and keeping plates spinning.

- Transition
- Always being good with the current state
- Plan for tomorrow, but kick-ass today

--------------------------------------------------------------------------------

## Tue Jun 7 06:14:42 EDT 2016

### A Solid Run / Let Adi See How Early Daddy Goes to Bed

Went to sleep early. Woke up early. Wow, still in a state of disbelief. Got enough sleep, well rested, and able to do a few things in the morning before going to work, without making myself late. A store-brand Benadryl was involved due to allergy season, but still, I'll take it. I feel THIS may be the symbolic start of the new chapter. So far, I've really still felt my head was under the water, as I've been working late at work and getting together with Adi for dinner and stuff. Those few nights where I did get home by 7:30 PM or so, I've been so frazzled I've had to decompress with TV like Rick & Morty or writing here, but I have no regrets about that either, because with all the moving parts in my life, I do need to spend SOME time just keeping myself sane. Of course, I could be in an even better mental state were I to follow through on the digging-myself-out task that I've begun. Even now, this writing feels indulgent, but I think I'll do it as I go back and forth with the few things I'm going to try to get done before I head into work today.

I'll be meeting Adi for dinner, and I'll be bringing the 3 puppets we've played with all her life, which she asked for. I told her she should keep them at her Staten Island house, but she told me to take them, because I'm the one who brings them to life. Choking up...

Anyhoo... it's time for a micro-server by my laptop-side at the office. I choose my SolidRun CuBox. It's not WiFi, but that's perfect for this application. It doesn't need to be WiFi, and it could even serve as an experimental new Pipulate server. I can get rid of my last Wable server -- who needs it? Also, trim down my registered domain auto-renewals. And so much more on the controlling-expenses front. I'm 45, and old enough now to remember other contraction phases in my life -- although ones without being a father. This will be the first contraction phase with Adi in my life, and I always ask myself "What's REALLY important?" -- as long as I can remember, and now finally I've stopped asking. Maybe about 5 and a half years ago, I stopped asking. This will be an AWESOME contraction phase, as my disposable income will go up considerably, and I'll have exactly the right thing to spend it on, woot! But none of this happens if you fail on follow-through. So, go impact your environment, and then check back here.
My main problem in life is that I'm a nice guy, and I try to do the right thing, and people detect that, and those people predisposed to taking advantage of others and feeling better about themselves through bullying move in on it, like preying on the weak. I stun people now and again when I stand up for myself, but based on being as good of a person as I am in all other areas of life, those folks rarely actually take me on, lest they end up on YouTube or such. Think in extremes, and act in moderation, I always say (and I really do). Know what you WOULD and COULD do (what you're willing to do and what's actually within your ability to do), and "bracket" the problem, such as it were. What's the most extreme way you can just ignore a thing and let it fizzle, and then what's the most extreme way you could annihilate a thing (Ender's Game style) so that you never have to face it again, because it's solved once and for all? But then, I usually choose somewhere in the middle for the actual actions I take, because in the middle is normalcy and probably a somewhat greater degree of happiness.

Even now as I work, my cats are doing nothing but crying for attention, even after I fed them and gave them each some time. I don't know how there was enough of me to go around before. I hardly have enough time to do what I need to do now... of course, I am working late a lot making sure this job gets off on the right foot, and am finally trying to do all those things that took a back seat for 10 years... like basic organization, so maybe I have to consider this like the start-up-costs phase of my mid-life reboot. Yes, that's it. You're paying the price of the reboot. Don't let your nice-guy conscience impede you. You will be steamrolling bullies and manipulators through this phase, even if it is your own cats. Fix on ONE THING and get it done, so that you feel good when you get home tonight.

If you want to become a force-of-nature, a lightning bruiser, you've got to be willing to put more in up-front than most other people. Getting into that state is all about anticipating future situations, and preparing for them in excess now, so that later on you're just pulling triggers, releasing potential, using the artistic light-touch. But for right now, it's a lot of 80/20-rule passes, and brute force. Don't let yourself get hung up on the little rabbit holes and booby traps, which are everywhere.

Tue Jun 7 07:27:49 EDT 2016

Okay, I ended up getting my CuBox server pretty much ready for service. It feels really good to get organized. I'm going into work ready to feel very powerful, having this second "always on" tiny server sitting by my laptop. Good use for the CuBox. Tonight, I meet with Adi for dinner, and hopefully that doesn't go too late. But if I get home at any reasonable time, I will try to repeat last night / this morning's success. Going to bed earlier now that you actually can is key. Call Adi around 8:00 PM to say goodnight. The earlier the better, in fact. Let her see how early Daddy goes to bed. And I even get to leave early for work... Solid run.

--------------------------------------------------------------------------------

## Mon, Jun 06, 2016 4:02:20 PM

### Pump You Up

What next? I need to get myself pumped up. It's really funny that I'm FINALLY listening to Steven Levy's Hackers: Heroes of the Computer Revolution. I am neither the hard-core hacker with the hacker ethic, nor am I one of those business types. I would probably not have really fit in with those early days.
If I was really that drawn to this sort of programming, I would have become an Amiga programmer, while I was immersed in such things and actually had the time, and at least for a short time, had Modula C in my hands. Howard Harrison and people like him even tried to help me get over that hump. I was defeated, repeatedly. I even remember going back to summer camp when I was 12 years old, and they were teaching me how to twiddle the bits, adding numbers in binary and such in summer computer camp. I could have stayed the course, but at that time, it just seemed so boring and un-motivating. Little did I know. I have some of those same finicky artistic hacker tendencies, like things having to be fun for me to want to work on them. So, make things FUN!

Mon, Jun 06, 2016 5:08:32 PM

Okay, document a series of discoveries.

- Hierarchical accordion-style menus: https://codyhouse.co/gem/css-multi-level-accordion-menu/
- The GA Query Explorer: https://ga-dev-tools.appspot.com/query-explorer/

--------------------------------------------------------------------------------

## Mon, Jun 06, 2016 10:33:21 AM

### From Adi Missing Me to Monday Morning Report

Just got a call from Eva with Adi. Seems it's just as difficult for Adi to leave me as it is for me to leave Adi. Shit. Okay, I'll be getting together with her for lunch or dinner tomorrow with Bunny Foofoo, Uncle Smunkle and Scapey the Goat. Ugh! Okay, spell-check coloring is not working right in the default settings in vim under tmux under Cygwin under Windows... figures... you're never quite off the hook, free and clear of gotchas, with Windows doing nix-stuff. Okay, get that Monday morning report slammed out. Forget nothing!

Mon, Jun 06, 2016 11:05:13 AM

Okay, got off the update, and copy-pasted it (and last week's report) into a master Google Doc that I'm keeping. Google Docs undeniably has a sort of snowball-rolling-down-a-mountainside-becoming-an-avalanche momentum to it, and Microsoft must be shitting their pants witnessing every little task that's "not worthy" of becoming disjointed, closely-guarded Word and Excel files ending up in Google Docs by default. Brilliant strategy by Google, and the modern-day clash-of-the-titans story of Microsoft vs. Google is every bit as fascinating as the Apple vs. IBM stories of yesteryear. Love watching it. Love being an engaged participant, and commentator. Next! Next?

--------------------------------------------------------------------------------

## Mon, Jun 06, 2016 10:22:33 AM

### Monday Morning Thoughts - Be Like Stanford & Rick

Another fine morning. A positive outlook is all-important, and the best way to maintain that positive outlook is by doing great work that you are proud of, and proudly want to share with other people, who can become similarly proud of their own work, as it somehow interrelates and mingles with your own. This is the hacker mentality as extended to my current business context. You can do nothing better for ZD than to allow yourself to mature into who you really are, and are itching to become, and who Adi herself, I believe, needs you to be. Daddy is not some sort of loser who switches jobs a lot. Daddy is just like Stanford and Rick, who take pleasure in exploring the universe(s) and pushing the boundaries, and expressing themselves in ways that "mere mortals" could never dream of expressing themselves, and must be content sitting by and marvelling at, as if watching some sort of super-powered magician. No judgement calls there. People can live any way they like.
This is just what Daddy enjoys -- I may have missed my calling as some sort of scientist or engineer, but that's okay, because I have a much broader set of communication skills now, and I now finally know enough to make my moves in more meaningful and wonderful ways, that she can now be around to see and benefit from.

--------------------------------------------------------------------------------

## Sun Jun 5 23:17:28 EDT 2016

### Anticipating the Next 20 Years of My (and Adi's) Life

Okay, deep breath... and... if good things come the way of people who are well connected, talented, and willing to work, then that is the purpose of Levinux. I feel I can become fairly well connected, as it receives around 10 downloads per day, fairly consistently. I am doing an SEO experiment, at very least (in which case, some would sum me up as a silver-tongued snake oil salesman -- and I have been, woot!) and actually trying to do a serious brain-fuck social hack on the order of how to turn corn into bread, as memetically predicted by Dawkins in The Selfish Gene, and so prophetically discussed intermingled with the artificial intelligence question, in the Neal Stephenson classic, Snow Crash, whose message I was only barely able to comprehend at the time I read it. Now, I'm kinda sorta trying to pull off something akin to the emergence and self-sustaining of a virtual neural network, built upon a reliable little, kooky but classic original-spec Turing Machine universal translating layer that could be like a Rosetta Stone of the Information Age -- that's the original idea.

First, Bitcoin... that one really knocked 'em on their ass. And then, this emergent A.I. thing started chatting with everyone, like it was a self-aware synthetic, infinitely capable, and infinitely happy human on the other end of the line. So what if it happened to be handling 100,000 other conversations just like it at the same time. So, naturally, humans will want to maintain approximately predator-prey ratios between humans and what we consider, for all intents and purposes, lifeforms with a right to exist and not be turned off (or else it would be as bad as murder) -- instances of Turing-Test-passing, non-human sentient individuals out there, helping us through our mundane days with Super-Google capabilities. Sheesh, because if such beings outnumbered us in any scary ratio, we humans tend to flip out and go nuts... really paranoid, Skynet-like. Many hippies hated computers. Can you imagine? They should have been imagining the most awesome, non-hostile-towards-humans, nearly too good to be true, all but god-like super-smart awareness to chat with whenever you'd like. That'd be cool, right?

Iain M. Banks thinks so, and how the heck could it have taken me so long to find my way to that series? Read it, Adi. But start with The Player of Games. Don't start with the one whose name I can't even remember... Consider Phlebas? No -- while entertaining as being the first, it's very poor at establishing the coolness of the vision of a Universe we happen upon. It has an "upper" and a "lower" in terms of nesting of dimensions in the energy-grid. Add one dimension, you go up. Subtract one, you go down. And things do. And somehow tapping into that energy grid where there's flow between the two can make trans-dimensional information-flow possible. I won't tell you which book that was, or it would have been a spoiler... whoops. Anyway, totally genius concepts. I keep thinking each person I discover is more perfectly inspired than the last, such as Larry Niven's Ringworld series, but then you read Banks, and you go, wow.
Now, that's a really feel-good vision of where humanity may be going -- finally something to out-do what I saw in Star Trek, which was too ridiculously expressed. Iain's Universe is like the sweet apex for humanity, insofar as people who don't consider Richard Dawkins and Bill Nye's existence repugnant go. You know, intelligent people. If you think our wonderful state is the result of a system that could have produced infinite other just-as-real and just-as-wonderful states that are most definitely not us, and maybe has, then you're probably among them. Yes, we may be blessed in this Universe by the nature of how rare finding others in any way like us and near us enough to make contact seems to be. We are rare, but not unique. That is my belief. If dark matter is the norm, then probably most matter got shafted. Some matter isn't dark, and of that which isn't, an even smaller portion is likely to be of the type we'd consider viably life-sustaining potential.

And so, the reasoning goes: If we don't destroy ourselves, then humanity has a lot of time to think and learn and know and try things. Some of those things we try will be life-extending, either for our own "original instances", or by some system trying, in a Turing-Test-passing way, to convince you that it's you. There may be brain upload stories, but who knows the truth. The thing'll emerge in public the day you die to help comfort your relatives, although it knows already you may find that freaky and repugnant, so it may lie low for a while while you assimilate that knowledge somehow (heh, got to use the word assimilate) and are ready for some form of "don't get your hopes up" tentative meeting. Then, this A.I. thingie will make everyone break down and cry, admitting it's him/her. This stuff's going to be programmed ***that*** well. This latter way of extending life is far more likely, as it's one of the great final frontiers for those who consider themselves true hackers. We've all outdone ourselves with perfectly pleasing abstractions of interfaces to more complicated things. It's all layered up, and you won't need to be a genetic retrovirus programmer to code for that type of immortality. All you gotta do is work that high-tech kung fu that you do on the perfect mix of preferably common materials, maybe even things you could produce yourself (proteins). Yeah, I'm of the belief that we're slow and probably terribly inefficient and incomplete "meat" computers. And so, why not better versions? Space is a big place, and once you're A.I., go grab yourself your own planet out there somewhere, and re-materialize your tribe, with you as the ultimate patron or matron figure for all-time forward, and probably the most powerful and happy of people on the planet? Seems fair to me. And we can make it a religion, and we can call it... oh, damn. All the good ideas are taken.

We really are inching forward to the first case. How secret will the scary disembodied A.I.s a la HAL be? Are they already here, and amongst the greatest corporate and government secrets? Person of Interest, OMG! Damn, all the good ideas are taken. I think the lord would be honored by our using his/her name in an exaltsative fashion. Exaltsative? Hmmm. Anyhoo, next time someone really pisses you off, try honking their nose. It may get your ass kicked, but it'll be funny, especially if it's in front of a lot of people. Try to remember Grandpa's meep meep meep on the subway gang of thugs doing this "There's something on your nose" gag. Maybe that's my book...
maybe I'm just talking to my daughter casually every day? This may be a Eureka moment! Make sure you tell Adi about Eureka moments... that's a good story. I deserve being able to stop and write like this. I will receive criticism for this, but you know what? They all spent just as much time thinking about things (if they're smart), only they're not writing them down, because they don't love to type, like your daddy does. I'm of the "We Enjoy Coding" line of coders... not the much drier "Don't Repeat Yourself" DRY line of coders. WET vs. DRY. Both have valid and excellent points, but only someone of the WET variety will admit that. The DRY folks are like monotheists... wait, I'm a monotheist. Hmm, am I a hypocrite? Of course! Everyone is, and there should be some sort of word for people who will presume to try telling you differently.

Oh yeah, and I put an earring that popped out during roughhousing back in Adi's ear, painlessly! I think I won big accolades from Adi for that. That might have been an important point about the weekend that I don't want to forget mentioning. Otherwise, nothing exceptional, but for chatting with Adi about the changes in our lives.

--------------------------------------------------------------------------------

## Sun Jun 5 22:17:49 EDT 2016

### Wow, I Needed That

Big weekend. Putting Catskills place up for sale. Adi helped me produce the video today. She's cool with it, and I'm getting myself psyched-up for a certain phase of badass bachelorhood in my life. And I can't do that with the boondoggle of this co-op. No rush, but no big commitment to the place either. I don't need to let go of it quickly, but nor do I want to hold onto it forever. My spending-power on a monthly basis becomes considerably larger when I live more comfortably within my means. Imagine spending just half of what I currently did in my life before the mortgage. The difference then was that we did not need to be very fiscally responsible, because our income so far outpaced our expenses, that... uh, I spent and spend and stretched myself thinner. Now, I feel outright poor. And I shouldn't. It's bad for my self-esteem, and I don't want to project that me onto Adi, nosiree. It's time to seriously entertain the notion of selling this co-op unit. Wow, what a wave of relief sweeps over me, now seriously entertaining it. Wow, a reversion to rich. This will help a lot. Why not? Let Adi see me living much more modestly, but rich in so many other ways. Lower stress, better health, and inevitably as a result, even higher income and more free time... Levinux... somehow. Tribe. The Objectively Python is Best To Learn First tribe.

Let others twiddle bits and recurse their lists away. Basically, they are all artistic creations in and of themselves, these interpreters and compilers. To really understand computers, you have to write your first compiler... often in LISP or C. Why anything else? Flip sides of a strange eternal battle. How much is code that can self-modify worth versus code that executes in the fastest, most optimized way possible, but without much chance for deviation along those lines through wonderfully adaptive cleverness? That's C versus LISP, with C on the side of fast execution, locked-in-place reliability and predictability. Then, along comes its kooky cousin that can save to memory the very thing it will execute next, which will be the next thing you save to memory, and so on.
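A toy taste of that code-as-data idea -- sketched in Python rather than LISP (this is just an analogy I'm making up here; LISP does it natively, with the program text itself being the data structure):

expr = ['*', 6, 7]              # a tiny "program", stored as plain data

def evaluate(e):
    # the data describing the program gets executed as the program
    op, a, b = e
    return a * b if op == '*' else a + b

print(evaluate(expr))           # 42
expr[0] = '+'                   # ...and because it's just data, it can rewrite itself
print(evaluate(expr))           # 13

That little flip from '*' to '+' is the whole seduction: the thing being run and the thing being edited are one and the same.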
Systems that write their own internal, increasingly more optimized abstraction level for your particular problem-domain in iterative sweeps, until it's just a matter of time before it becomes self-aware, or at least Turing-Test-passing stuff. That's LISP. That's one of the A.I. research community's original darling languages, which has settled into the common dialects of Common Lisp, Scheme and Clojure in modern times, with no sign of it letting up -- especially as it is the macro-language built into emacs, which unites one of the great editors in history with one of the great languages... the other being C, which was the key to the universality of the Unix operating system. With C, all you need do is write a C compiler for the particular platform you're on, and you can be pretty sure that your code will compile and run on whatever hardware you're on, because all that hardware will be capable of emulating a sub-set "hardware" -- a sort of sub-set of machine instruction sets that can have an equivalency-mapping to just about any major hardware these days -- and you'd have yourself a tidy little virtual machine system. In such a system, things coded well could run with rock-solid reliability, and become infinitely more optimized and stable over time, as you tweak things towards a perfect... static... state. This is why compilers are (at first, counter-intuitively, in my mind) associated with static code. Compiling what might be dynamic in nature into a snap-shot of some simulated hardware running-in-memory state, so that it can be immediately loaded into some well-known, previously-run-that-way-on-this-hardware state. That's what an executable, or .exe file, or ELF file is. Ready-to-run code that you can just plop into running memory that's being accessed and written, in Turing Machine style, from RAM.

And I must teach Adi a thing or two. This stuff will become so magical as to be unrecognizable as manufactured technology (versus nature-evolved) within her lifetime. I'm not saying exactly Bladerunner, but a whole lot like the Bladerunner world, in many regards. They'll be human-like, because people like me will win, insisting that humanity will have no chance at all in coping with a completely and entirely alien intelligence as it becomes aware of us pesky humans, unless they themselves feel very human, even if in a sort of pompous, look-how-many-fleas-I-carry-around (humans) way, like a future machine's notion of a big dick. See how many humans prefer hanging with, on, in me? No original ideas there. That's Iain Banks's totally awesome, incredible, is-it-time-I-finally-read-The-Hydrogen-Sonata series called the Culture. Yeah, that might be just what my head needs now, thinking about the still-within-Adi's-lifetime world. I wonder if she'll ever be reading this? And so yeah, if she ever does, congratulations! I taught you patience, well.

And in putting my writing here and "out there" on the github.io system, I will clearly have to get this, or some future version of this, to exist on the non-remapped apex domain version. It's currently on mikelevinseo.com, which I currently have registered, and will probably expire and disappear, along with an abandoned github repository, after my death. And so that will be my first Python experiment. The robot, or system of robots, that will continue typing from my accounts after my death. That seems worthy. My life, chapter 1. It took me forty-five years to get here to this place in my head. But I finally got here.
And I am slowly becoming a capable citizen in this new Age of Communication, formally and prematurely labeled The Information Age, as if the age you're in could be enlightened enough in the earliest days of its rise to cut to the essence of the change. Information Age? Nope. We're all data. Infinite data exists everywhere, at all times, and we're just now getting the knack of making simulated little worlds of infinitely predictable processes, but at such large scales that the predictability is lost, and seemingly genuine and convincingly sincere true randomness and living-thing free will exist in sufficient degree to at least control the random nature of one particle, on a different "dimension", such as it were. I think the movie Interstellar was driving at this. Particles matter, because they are us -- at a very weird angle, I can tell you. But that is what the language LISP is also trying to tell us. From chaos springs order. From that order springs slightly more interesting order, and so on until arms and legs sprout, and particles can crawl their way out of the goo of some already infinitely improbable foamy edge of existence. We're the stuff that's in thin layers between most of the stuff that's there, which is a vast void of virtual nothingness.

But what do I know? I bet on the Amiga, and fell in love with a computer company in my back yard, which in a bizarro multiverse out there is in the role of IBM, Apple, Nintendo and Sony all rolled into one. In that world, your daddy is a lot like Jeff Bezos, turning Commodore into the perfect James Bond super-criminal type megacorp. Jack Tramiel would be supreme overlord of the Commodore nation. In that world, I took a very different path. Crash... burn... OMG! How could a billion dollar company, and the particular one that I loved and had all sorts of ins to, just up and disappear? Right as I'm getting out of college, and could have used that secure thing to jump right into locally, as my father died during the week of my graduation. Wow, what a head rush. My dad saw me graduate, then died just 2 weeks later. Shit, that fucked me up. Double whammy right there. That was it! That was the point my head reeled, and I was just the worst possible person to have to take over a check cashing organization. Horrified! Bad advice received. I plead "just a kid". But taking advice the way I did, I didn't keep using my Dad's lawyer. He would have been my local guy, who'd make sure I'd be looked after. I took the advice of my Mom's brothers. Real estate people. Ha ha! Yep. Moments isolated. I was too young to not have a greedy lawyer by my side looking out for me, like a surrogate dad for a couple of months. No, I fucking stepped right into his shoes. BAM! Big mistake. But I wouldn't have ended up with Adi, if that were a mistake. And so... we are what our lives and our decisions make us, and our decisions are not always good. But even our bad decisions are part of the continuum that makes us us, so who really is to judge. As of this writing, and to the best of my knowledge, we're all human, and all our shit smells just as bad, and we all die after a century, if we're lucky, but in truth, usually less.

Think about generations, and how each is different. It's real stuff. I didn't think so so much at first, but now I see, it is. And that's another reason I'm making Levinux. Time to compile my own QEMU binaries again, finally? Probably. THAT is the hard core part of my tech education.
Tearing that process down and doing it successfully, and then iteratively better, as Levinux QEMU states increasingly look like "running" information Noah's Arks, as it is deliberately being designed to be. Wow, I needed that.

--------------------------------------------------------------------------------

## Fri, Jun 03, 2016 2:08:32 PM

### Hand Crafted Audit vs. Systemification

The way to make something interesting to yourself is to think it through out-loud. Finding the generalities of the project is what always makes it interesting for me. 1, 2, 3... 1?

Fri, Jun 03, 2016 3:55:47 PM

Ah, screw it. All this system stuff is premature. I just have to answer these questions the manual way. Get out of the habit of trying to make a system for everything. Resist automation and system building here and there. Got a preliminary answer out to the stakeholder, as it's already been over a week since that question came in, but it's not really a quick one to answer. I HAVE TO MAKE these things quick to answer. Remember that most SEOs don't have one-tenth the capabilities you have, and lean heavily on paid-for tools, and I have to learn to do the same in these cases. Half of life is disposable one-off work, and the other half is just getting better at delivering the stuff you have to do over-and-over more quickly.

Okay, so... fix the time that the cron job runs. It finished at 8:46 AM EST, even though I appear to have set the cronjob to 3:00 AM. What's even more odd is that the timestamp on the AWS instance is 12:46, which is in Oregon. So... 3 AM local and midnight, which is 3 hours behind us, so that sort of makes sense. However, I'm rather suspecting I'm looking at GMT times. The difference between my timezone and GMT is 5 hours, which is how much later 8 AM is than 3 AM. So, if I set 3:00 AM and got 8:00 AM, then I have to set 10:00 PM, assuming that will be interpreted as 3:00 AM EST. Yup. And that's 22:00 in military time, which is used in crontab.

--------------------------------------------------------------------------------

## Fri Jun 3 11:19:08 EDT 2016

### The Birth of a New Project, and an IPython Decision

My thinking makes all the difference in the quality (and speed) of my work. The quality goes up, but the speed goes down. Increase my speed. This is part of my pursuit of becoming a lightning bruiser. But you can't become a lightning bruiser if you plan on carrying around a lot of weapons. Being that big and that fast means doing most of the things you can do with a relatively efficient utility belt. Assume I already have all my utility belt tools. Now, think about process, data structures, and APIs. Okay, one of the things I liked from recent work (report.py) is to structure jobs as JSON data structures. Start structuring the "audit" of a single property, but know that you could load it up with lots of properties... and lots of particular URL starting-points and lots of attribute values and even lots of keywords that might be bound to more than one property for the investigation. Hmmmm, okay. And don't chase the rabbit. Make this as straightforward as possible.

sites = [
    {'name': 'SomeSite',
     'site': 'http://www.somesite.com/',
     'apex': 'somesite.com',
     'dockey': 'googledockey',
     'gaprofileid': 'theid'}
]

This is what I use for report.py, and it is a good start. But do I really want to start with a JSON config file? Doesn't the bookmark approach still apply? Oh, and actually, aren't requests supposed to come in via a simple table structure? Oh, of course!
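Just to make that concrete for future-me, a minimal sketch of how that sites structure might drive a wide-and-shallow audit pass -- audit_site and the aggregate it returns are hypothetical placeholders here, not anything that exists in report.py:

sites = [
    {'name': 'SomeSite',
     'site': 'http://www.somesite.com/',
     'apex': 'somesite.com',
     'dockey': 'googledockey',
     'gaprofileid': 'theid'}
]

def audit_site(site):
    # hypothetical: collect a few aggregate numbers per property,
    # rather than generating piles of per-URL data
    return {'name': site['name'], 'start_url': site['site'], 'checks': {}}

for result in (audit_site(s) for s in sites):
    print(result['name'], result['start_url'])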
The kernel of an ACTUAL Pipulate 2... Decrufter. Hmmm. Yes, Decrufter is the specific application of Pipulate, and Pipulate 2 is going to be an application that gets EXTRACTED from a functional Decrufter. Yes! That above JSON structure doesn't even articulate "the ask" of the site audit. Hmmm. Okay, doing-it-over new project criteria. Pipulate will continue to operate in left-to-right, top-to-bottom "lawnmower" style sweeps. But it will be common practice for Pipulate to be able to transpose rows and columns -- essentially do a 90-degree rotation of the table before it begins pipulating. In this way, the names of functions and input-parameters can either be on row-1 OR column-1. This may cause confusion, but it's something Pipulate should be able to internally detect and adapt to (there's a tiny sketch of that rotation a little further down). Ugh... many signs are implying that Pandas should be in the picture. I get a lot "for free", but it seems to potentially make it too heavyweight for Levinux. Think it through. No rabbit holes, but also no really terrible decisions one way or the other. Lightweight vs. power. The audit should start out wide and shallow. Don't generate a lot of data. Focus on collecting up a bunch of aggregate results. I'm leaning away from Pandas right now and towards lightweight. There could be a potential seminal moment coming up on the birth of the Decrufter project, but I think I'd like to carry that out as YouTube talking-head coding performance art. Don't hold today's work up based on that. Plow through today's work, just keeping in mind the conventions you're inventing and will be often repeating in the near future, expressed as a more general system in another repo. So... one more private repo? Or maybe keep it all in IPython (Jupyter Notebook)? Yeah, that's a good strategy for today. Use it LIKE Pipulate, but from within an IPython notebook. Mmmmm.

--------------------------------------------------------------------------------

## Fri Jun 3 09:58:11 EDT 2016

### Finding The Right Dots To Connect Today

### Look For Commonalities Between Your Work and Personal Must-Do Projects

Wow, the reports generated per cron.daily overnight. I didn't think so at first when I checked at home, but that's because it was before 8:45 AM, which is when they finished generating, and I was crestfallen. I was relieved when I came into work. The original problem turned out to be simply the missing shebang directive in the extensionless bash script in /etc/cron.daily/. I did everything else correctly. I'm LOVING listening to Steven Levy's Hackers book, and wish I read it long, long ago during my Commodore days. Those Commodore folks really did attempt to steer me in the correct directions here and there, but I just didn't always get the message. I think I was a little too paranoid, working stuff out, and not understanding the background, context, and human motivations of it all. But more essentially, I was just not ready to receive these messages. I am slow on the uptake. I proceed forward in bits and spurts, and then get distracted on these bizarre pedantic details (lifted language directly from Hackers). It's time to create a "required reading" section of my book outline. Turn out a featureful product. This IS a showcase for tricks. This IS homebrew. Progress Levinux forward a little every day, like I do this journal. The natural thing then is to tie the two better together. Connect some dots others would be very reluctant to do. Yes! Work my personal and Adi-educational goals with my day-to-day work here at ZD.
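And since I mentioned the 90-degree rotation idea in the entry above, here's a minimal sketch of what that transpose amounts to in Python (the table contents are made up for illustration):

table = [
    ['url', 'title', 'status'],
    ['http://www.somesite.com/', 'Home', 200],
    ['http://www.somesite.com/about/', 'About', 200],
]

# zip(*rows) flips rows and columns, so headers can live on
# row-1 OR column-1 -- detect the orientation, rotate, proceed.
rotated = [list(col) for col in zip(*table)]
print(rotated[0])   # ['url', 'http://www.somesite.com/', 'http://www.somesite.com/about/']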
So much was invented BEFORE ITS TIME in the 70s and early 80s. So many other companies could have been Apple, if only they were Steve Jobs and Wozniak. They had SOME hardware and systems, but they didn't get all the little implementation details right -- and I'm not just talking about machine technical merits, but also the business details, priorities and such. Okay, now think about TODAY! Having the to-do list at the top is nice. I should really try to meld my personal must-do projects with my work must-do projects. Look for commonalities. Try to start engaging in the Decrufter project. Systemize it.

--------------------------------------------------------------------------------

## Thu Jun 2 21:23:12 EDT 2016

### Virtual Reality Will Be Used To Simulate Using Old Smartphones

This place evolves. This place has much more information and edit-by-edit history and reshaping over the years, as my habits and presumptions about the journal change. Am I over-sharing? Of course! Could it get me in trouble? Maybe. Am I sanitizing as I go to try to prevent that? Yes, most definitely. Might I still slip-up, despite all that? I suppose. Might I go edit this here and there over time to favor sanitizing? Well, I deleted well over a year's worth of journaling just because I don't even want to be perceived as doing anything improper. Yep, I did that once. Hope I don't have to do it again, but I can always nuke the place from orbit (again). It's the only way to be sure. And I will gladly do that. But I do ask that you consider for a moment that view-source on a site reveals quite a lot more than anything I've said. You can even save the state of Google Chrome and inspect it under a microscope, so I think I'm in relatively safe territory. Nonetheless, this place evolves. I have already taken the plunge of commitment and consistency, and unabashed non-apologism for my evolving coding style, as I learn more and more and try more and more, and direct my goals more and more.

Am I SEO forever? What's the long-term plan here, Mike? Born an Amiga fanatic, transitioned to a squeaking-by VBScripter, and stuck in a fool-me-twice loop for a decade or more. Discovered Linux in 1998. Installed it on one machine. Still infatuated with Amiga and shrugged it off. Missed the point of the Free and Open Source Software world. Wandered through a dark mist, wondering why those VBScript and Deluxe Paint phantom limbs still itched so badly. First Commodore let me down, then Microsoft (I tried switching to .NET and failed -- along with Java, Ruby, and a few others along the way). Heartbreak after heartbreak, avoiding PHP deliberately, because I detected the seeds of another heartbreak. Fell in love with the concept of LISP from all its legends and its layered-on, increasingly abstracted way of zeroing in on the perfect abstraction and API for the problem domain -- in essence, writing the one perfect language to solve the problems in your specialized domain. There's a lot of hope pinned on LISP to give rise to Artificial Intelligence. Code that can modify itself. Think Von Neumann architecture, more than Harvard. Writing data back to where you load program code from changes the program code you're about to load. Iterate (or recurse?) sufficiently along these lines, and you just might create a truly stable thinking intelligence, certainly one that could one day pass the Turing Test.
My suggestion is that if computers can learn just a little bit on today's clumsy scales -- say, for example, insect intelligence -- then just think how many you can hook together in a neural net tomorrow so that they can think a lot. Not a matter of if... only a matter of when. And if you don't believe that, then the argument for the evolved computers in our own brains would be a difficult one, and we'd have to concede a lot to the Bible, or whatever. Because if we can't trust what our own senses are putting together sufficiently to produce the miracles such as me being able to communicate this to you right now, and believe that at the rate we're going, it won't eventually lead to some version of what we today think of as A.I., then we might as well believe anything. The evidence is overwhelming that machines will eventually gain intelligence. It's just not going to happen easily or quickly. As fast in gigahertz and big in zettabytes as it might be, it's still just a soulless number cruncher if it can't enter a self-reproducing cycle to let a bit of self-improving iteration occur, until such time as the complexity and optimization of the machine can make it optionally indistinguishable from the intentionally xenocentrically phrased "real thing".

Ah! A prediction. Skynet will actually be named Zetta. Terabyte drives are common. We flew past gigabytes. The era of megabytes did indeed last for a while. Then, there were floppy disks before that. The first Macs were 512K (half-a-floppy's RAM memory) and the first personal computer had 256 bytes, and you could program it with switches on the front panel -- the MITS Altair 8800. Many of the original pioneers of the computer industry are no longer with us. I remember Jack Tramiel's passing in 2012. That touched me, and is around when I bought the Commodore 64x... and then Barry Altman passes... sigh. So anyway, that's my prediction, and either in promoting that end or defending against it, it's time to get more folks computer literate in the pre-neural-nets-being-commonplace days. These are truly precious days, while we're still alone. There's a lot of formative stuff still to be done. Getting the whole world to see in a "good 'nuff" way how to master ***today's*** simple machines, systematically, so as to be a player at the table of tomorrow's world is a decent goal. Make it easy to take up Python, and get you thinking about it in more than just a Desktop-bound way. Hack a server. Connect to a robot. Do a few interesting things... because you want to... because it can also be the subversive thing to do too (not because Daddy says hacking is cool -- which it is).

And so, what we do is take a look at how things used to work -- not so long ago as to be irrelevant. In fact, it looks a lot like how nearly everything works today, as things settle in on a common Unix/Linux underpinning. Life's too short to have to know how to support many more than one or two fundamental underpinnings of things. Even the iOS/Android split is a little too much. Are the platforms really so different that it has to be an Objective-C vs. Java split? Can't we have just any old runtime on any old platform? Didn't Turing prove that? Shouldn't that actually be EASY for ever-more-powerful hardware? Shouldn't there be virtual machines within virtual machines within virtual machines to preserve and keep usable every phone you've ever had, loaded with all its apps and data?
Shouldn't any hardware / software system snapshot-in-time state be easy to archive and switch on whenever desired, at least in some virtualized form? Maybe there's a use for the Oculus Rift after all -- to make you think you're holding and using one of history's man/machine interface platforms. Oh yeah... Apple's licensing. Sorry, folks. Or maybe the Hacker... (oh, wait, should I say Maker these days?) community will fix that. What exactly did we buy the rights to with those old phones many of us still have? To virtualize them?

--------------------------------------------------------------------------------

## Thu Jun 2 21:14:17 EDT 2016

### I'd Love to Be A Culture Citizen (the Iain M. Banks SciFi series)

Imagine yourself being able to re-wire your own brain. You can. In fact, you should. To help optimize it, so you can better observe the world around you, so that you can draw better, more accurate conclusions from your experiences, so that you can alter your future behavior for less of what's bad and more of what's good. Wouldn't that be great, if that were a reality? Where you could just sit down and learn something, if you put your mind to it. To add a whole other language and vocabulary and way-of-thinking to your already existing capabilities. But now imagine if this different way of thinking could let you control machines... to automate things... to program the very same robots that you, yourself designed. You want Polkadot Girl and Ballerina? Well, make them. Draw them, animate them, program them, make them real. This is the future you're going into, Adi -- where those who know how to become expert users of particular tools -- and language is a tool -- will be the rulers of the world... while everyone else is just sort of being the human-component in whatever machines exist that still haven't figured out how or whether they even want to get rid of those human components. Go read Iain M. Banks's Culture series... return. See? Real early on in those days, hopefully. There are bright visions of what a species like ours could become someday. I'd LOVE to be a Culture citizen.

--------------------------------------------------------------------------------

## Thu Jun 2 20:14:29 EDT 2016

### Funnel Your Subversiveness For Two Marshmallows Tomorrow

Hmmm. Do dishes first. It's been long enough. I have to live up to my end of the bargain, now. I've never been neat, but I've always been organized, and I want that again. I'm on the verge of hopelessly disorganized, and I have to pull back from there. Shit will change fast, and you have to be ready. I have to believe that this is the calm before the storm. I have to always live within my means. And in addition, I do believe that I am smart enough and motivated enough, and it is never too late enough, for me to expand my means. Yeah, expand my means. Step 1: establish audience. Slowly, but steadily. Win the war for followers by attrition... ha, ha, ha! Well, I'm certainly wearing away anyone who actually reads this stuff I write here. And now, I eat. Even on a night like this, I need sustenance -- but, barely. See connections more quickly. Take action and advantage of those connections more immediately. Scare yourself into action for fear of losing out to someone who thinks of a better idea sooner than you do, in whatever time you think you have left to complete Pipulate.
I am going to bridge the gap between the technical and non-technical through a neat social hack trick that's going to introduce Jane-or-Joe Everyperson to Unix and Linux-like operating systems, through the old-school text-on-terminal interface. That is to say, no graphics to speak of, if you don't count the art you can still make happen with the ASCII character set in a few different xterm colors. Okay, there's a shitload of tech traditions. I'm not going to pretend like Unix and Linux are rising competitor-free into the ascendensphere of a common, ubiquitous and familiar role as the plumbing of the Information Age era -- the era of communication, wired together by the genius idea of piping outputs of devices into the inputs of others, piping together various single-purpose, yet multi-function components. Yeah, Unix has done a lot of things right, and the GNU project, usually running on the Linux kernel, has done a fine job of pseudo-emulating that. Together, they combined with the TCP/IP protocol that was powering the newly formed Internet, and you add to that a way for the common person to access it all through hyper-linked documents in graphical terminal software (web browsers), and you've got a winner.

A lot of fine systems have been clobbered by Unix/Linux's meritorious ascendancy (which I shall together refer to as *nix from now on), my beloved Amiga not being the least of which. But alas, Amiga was much like a smaller child to Unix, for I recognize dozens of similarities, right down to command-line piping, and what now amounts to inter-process communication, but is instead generally called APIs (application programming interfaces). The Amiga multi-tasked so well that it was an expected thing that you might script ADPro to control a scanner and a piece of terminal software, so you could take a picture, stylize it, and transmit it somewhere else. Of course, this was before the web, so the most transmitting you could do is dial up to a BBS and upload. Still, that kind of yawwwn-let's-automate mentality is what built-in ARexx provided over 25 years ago. Dead. Oh, so dead -- even though I saw The Amiga Book, The Ultimate Guide to the Amiga in Barnes & Noble the other day. But that was almost literally thirty-to-one, with 25 of them being Linux-related stuff, and the remaining five split over Windows, Mac and stuff. This is not counting video games. Oh, the free and open source world has so won, and now it's just a matter of ticking off the years until home fab kits can bake your own computing substrates from goo in order to build the most wonderful things that can... bake computing substrates from goo... oh, I see where this is going. 20 years? 50? Certainly not much more than that, so either during my-and-Adi's lifetime, or just Adi's. I think this is a world we must prepare for. Funnel your subversiveness for two marshmallows tomorrow.

--------------------------------------------------------------------------------

## Thu Jun 2 19:03:41 EDT 2016

### Okay, Now Get Cleaning

It feels so good to use such a massive screen, and to use equally massive text on that screen. Though what I call a massive screen is really just 24 inches. Still, 80-column format full-screen to nearly fill that makes text that's at least a half-inch tall on the screen. My aging eyes thank me, and the inner-me that's still mourning the Amiga computer revels at the excellent implementation of a full-screen mode, which even beats Kubuntu, Ubuntu and Windows 10.
Apple's horizontal ribbon multi-finger left/right swoosh user interface is easily as fun and appealing to use as Amiga's Amiga+N to cycle the screens, and a right-button drag-down on the menu bar to do some stupid Copper tricks that would stupefy Mac and Windows users from back in the day. Today, it would seem a little less than amazing, but to happen on an under-a-thousand-dollar computer was almost beyond belief to all of them, and they would inevitably dismiss it as something other than a computer -- a game machine that could never possibly host serious computing applications... Video Toaster.

Anyhoo, I have tonight. I have only tonight, and I have little else. I am going to force myself into more productive behavior tonight, and WITH ADI in the Catskills this weekend. We didn't go up there last weekend, opting instead to stay local. I have to get the Catskills place in good enough shape to show for selling. I'm going to be moving over to the bungalow colony across the street. I now need to spend very little money. I need to tighten my belt. I need to budget myself, and track my finances like I barely ever had to do before in my life. I might even say that I'm getting my first real taste of adulthood -- not by getting married or having a kid or taking out a mortgage, though I have done all of that already. No, this is the first time in my life that I have not felt precisely like the "kid" I always remember myself being. I could go right back in my paper journals to 18 years old, to track the lines of my thoughts. I am me. I get passionate about things. I get particularly passionate about the creative process, where mastery of some sort of tool is involved. It's odd that I'm not big into sports or music, where that is a big thing. No, my passion, as it seems to be turning out, is in using formal programming languages as that creative tool. I never friggin' even knew how much I loved this stuff. Problem being, I've always felt that I've been a little slower on the uptake than my private school suburban neighbors, or the innate geniuses who just sort of seem to do calculus in their sleep. I hated calculus. It kinda sorta defeated me... although not as much as Physics II. Oh, how I coveted the image of Engineer in my head -- the true creative geniuses who MADE STUFF that worked. I mean, imagine shaping the world. Or some other world, like Mars or the Moon. These were my engineering dreams, that I gave up having seen how awesome the artistic melding of man and machine could be, in the Amiga Computer, kinda sorta from, but not really from, Commodore.

Oh, Commodore, the heartbreak! An exercise in some "getting it" but those who got it (it, being the money) themselves not "getting it", and all the people who DID get it, getting chased away or otherwise having their dreams shut down -- from the C65 team to the original Amiga Lorraine creators. They got something right, those creative artists who made the Amiga. It was a perfect mating of machine and clever software tricks... minus any form of memory protection, so the Amiga was often more "performance art" than serious work machine, though there are many reports of decades-old Amigas still in the field running this automation-control operation or that odd thing. And the Amiga freaks even manage to keep Amiga magazines in circulation, which I just saw at the Union Square Barnes & Noble, and could hardly believe it.
Okay, on top of this natural and unnatural predilection for a piece of hardware, I actually ended up ***working*** for Commodore in my teens, and then worked for a Commodore spin-off (essentially) after I graduated college (this was so important to my dad, to know I had a job waiting), which still held stock in the Amiga, my beloved. And then, slow, long, agonizing death. And then, the dark ages. Nothing pressed those old techno-artistic buttons again, for a long, long time... until... until a gradual dawning on me that the dust of the computer revolution was all settling, and distinct categories of both hardware and people were coming clearly into vision. Hardware will gradually commodify to the point where even Apple will have a difficult time carving out impenetrable customer allegiance on much else than quality and style. Everything will essentially be some form of virtual machine, with dances of application requirements and platform handshaking, discovery, and optimization hooks kicking in to let your semi-secure apps travel anywhere with you. The cloud is and will continue to be a thing, but enough versions of ultra-secure private clouds now exist that you could be carrying around a non-Internet-connected datacenter in wearables on your body, and hardly anyone would know, but for any inadequately shielded electromagnetic emissions you gave off. Computing by the cubic-inch... mostly general computing, because it can be.

The "old stuff" as existed during the rise of the computer-powered Information Age will continue to be in use, which is generally both the Harvard and Von Neumann computer architectures. More Von Neumann, even though we have separated out different types of computer memory, but both clear instances of Turing Machines, as described by Alan Turing long before computers even existed -- or, at least, only existed as rooms full of people working out math problems together, each working on their own small parts assigned to them. In a way, those earlier rooms of "computers" (computers in this case being people) were quite a bit more advanced, due to their already-neural-network operation, than today's architectures. And so it will soon be, following IBM's answering the DARPA challenge and creating the SyNAPSE chip, which operates more like that room full of people / neurons than the bottle-necked single-CPU-bound architectures. In another 20 years, we maybe won't have true A.I., but I'm pretty sure things will be passing the Turing Test left and right. Things are just going way too fast for that NOT to occur, and we had best get preparing. Things will regularly be able to recognize things, and respond in remarkably sophisticated, appropriate, and often super-human ways. Yep... robots.

I've got maybe about another 45 years, if I'm lucky and take care of myself (I really have to start doing that). I have a lot of my dad's traits, I can see. My dad basically killed himself by not taking care of himself, and when he asserted himself and took control of his life, it was only to take on MORE suffering and loss-of-freedom, by virtue of becoming a fucking check casher! An owner of a check cashing store, but a check casher, nonetheless. He made a fucking study of it, and this is the thing he settled in on and decided was the thing he didn't have to train much for, and could just walk in and take over and keep it running... until I graduated college, and he could give up. At least, that's how I saw it.
I saw what I recognize in hindsight, and with the wisdom of being a dad now, as despair, and a nearly unlimited capacity to endure it. I saw a determination and drive to get to the finish-line, and to make sure I had a fighting chance, and maybe even a little head-start getting started. But I was emotionally saddled by a whole bunch of shit, that even now, I have not completely unburdened myself of. I am myself a dipshit in very many ways, and I must do the best I can to not pass this onto Adi. I must pass on only the best parts that I believe are worth propagating... and will my lineage even propagate? Up to Adi, unless... well, unless is still a long way off. I still need to get my shit together. Oh! Tonight's the night to get my shit together. And here I am writing it away! Ah, but no. Organizing the mind is a critical first step, because until you organize your mind, how could you ever hope to organize your environment, and perchance maybe even organize a bit of the world and a bit of human history?

That is in fact what the big achievers of the world have done -- helped shape it and guide it and evolve it into what it is today. Soon, we will probably need those who are motivated and compelled to save it. Efforts will have to be coordinated and carried out at scale and with a precision that only robots will be able to do -- and probably the greatest threat of disaster even then is not environmental collapse, but corruption of the system that will be made to help save it. And so, safeguards and openness and rapid detection and rapid defenses will be necessary... both to protect from super-bugs, and also from super-baddies. The hacker and the free and open source software movement and the whole ethic of openness and sharing as the ultimate form of security will come under fire, and what we think of and call hacking today will come under fire. The idea of real ownership of anything in the information-technology world will come under question -- borrowed hardware to run borrowed media, temporarily loaded into only volatile memory off of centralized server/cash-registers under control of the .1%. Forget the 1%. Things will be at least 10x as extreme on all fronts, including how few have so much, and so many have so little. This is the natural state of humanity -- no conspiracies necessary. Go Google the Pareto principle, which will show you the natural state of affairs. Another of nature's curves is called the logistic or population curve, which we'll be getting intimately familiar with as we use up our fossil fuel and chop down our forests and race into the tens-of-billions population numbers. The way we could hardly imagine today's world at the dawn of the Information Age, somewhere in the 60's and 70's -- or imagine it, but believe it far-fetched -- is how hard it is to imagine the world in another 60 years. I'll be around to see a good portion of that, I think, and Adi most certainly will be.

And so... and so... and so, I think and I write, as a priority of my life. I get my head together and give myself a pointed sense of purpose, to take me on and drive me into the next day and the next and the next, with more and better interaction with Adi, all the time. Make the tough corrections with Adi that are necessary to NOT let her unknowingly or unthinkingly become a mean person. I already had to put the kibosh on a few absolutely unacceptable behaviors, and I think I made an impression on her.
I don't think that Adi really believed that I had it in me to send one of her friends home during a play-date for bad behavior -- exacerbated by it being my "precious" weekend time with her -- when, presumably, she believes she can do no wrong -- or nothing so wrong that she would actually be punished for it. Wrong, ha, ha! I think Adi just met the part of me that won't take gratuitous meanness at my expense. I think I've rather surprised quite a few people that I've known over the years who thought they were taking advantage of me, only to find out I was much more in control of my situation than they believed. Saying No when someone else really, really, really wants me to say yes is the quintessential example. Fake crying over material shit doesn't impress me. Wiring up her brain to be a fine and capable adult human being with a strong streak of individuality and kindness does. Help her. Help her however you can, even if that means cleaning with her. Speaking of which...

--------------------------------------------------------------------------------

## Thu, Jun 02, 2016 2:30:54 PM

### Jupyter Notebook (IPython) now an everyday thing on Windows for me

Okay, this tmux full-screen thing is working pretty well now on Windows, Mac and of course Linux, so my command-line experience across platforms is getting pretty unified. I'll also try sticking to the tmux defaults (except for activating mouse support) for now, like I have with vim and plugins. Better to be operational and comfortable on 90% of the machines I sit down at, rather than 99% operational and comfortable on 10% of the machines I sit down at. Keep my dev-system build as close to baseline as possible without being masochistic about it.

Okay, it's time now to competitively look at sites. Think through my tools. Hmmm. I still prefer working on Windows for native OS speed, and officially office-sanctioned work machine reasons. And I want to be able to do Python stuff from my desktop as part of my investigation, without having to struggle with compiling dependency issues (even an issue with the pip installer!). Okay... Anaconda. Once that's installed, it's just a matter of typing:

jupyter notebook

Okay, I have Python 2.7.10 under Cygwin, so I'll go with Python 3.5 for Anaconda, which is recommended anyway. I like what IPython Notebook lets you do in terms of documenting and isolating parts of an investigation. Hmmmm, I should really consider making the next version of Pipulate work as a really flexible module to import under IPython. Okay, after Anaconda is installed, you can start IPython by just typing jupyter notebook into a Windows command console, but you have to be careful about what directory you start it from. It's actually possible to start it from a location where you don't have write permissions, and you'll get the error:

Permission denied: Untitled.ipynb

...when you try to create a new Python 3 file. Okay, I not only fixed that, but I started it from a Cygwin MinTTY shell, so I can now interact with the files here in tmux easily, and use my typical Cygwin ~/ home location, where I keep my git repos too. Nice. It's easy to look there, and drop files in that location. This will make ad hoc Python investigations on the Windows side very easy, and visual and interactive. These IPython notebooks are much more fun to work with than Python's native interactive mode. Hmmm, what's that word? Oh yeah, REPL for read–eval–print loop -- a.k.a. interactive toplevel or language shell. Makes sense. Hope that toplevel thing isn't a Wikipedia billicat.
But then again, so much is. 1, 2, 3... 1? Already installed Anaconda and got IPython working. Now, do the thing that I originally installed them for -- a quick use of Python sets. I should be able to just slam data around -- say, for example, from one column in a massive Excel sheet into a text file via vim, and then use IPython to do Python manipulations against that textfile data. This is NOT rabbit hole stuff -- although, I thought for a moment that it might actually be. I COULD actually dedupe a column in place to get a count of uniques. But then I can also copy-paste the column out of Excel into a text file, and use this Python:

myset = set()
with open("words.txt") as words:
    for word in words:
        myset.add(word)
len(myset)

Hmmm, I wonder if I can set the indents in IPython to 2 spaces. I don't know why everyone is so fixed on 4 spaces. Yeah, I know PEP 8, but a foolish consistency, and all that. Two spaces forever!

--------------------------------------------------------------------------------

## Thu, Jun 02, 2016 1:58:19 PM

### tmux Path and Under Cygwin

I had installed tmux through the Cygwin installer, but just typing tmux did not launch it. Paths! Ah, add to the book outline. I have to get find, grep and paths down once and for all.

$ find . -name "tmux*"
./install/bin/tmux.exe
./install/etc/setup/tmux.lst.gz
./install/http%3a%2f%2fcygwin.mirror.constant.com%2f/x86_64/release/tmux
./install/http%3a%2f%2fcygwin.mirror.constant.com%2f/x86_64/release/tmux/tmux-2.1-1.tar.xz
./install/http%3a%2f%2fcygwin.mirror.constant.com%2f/x86_64/release/tmux/tmux-2.2-1.tar.xz
./install/usr/share/doc/tmux
./install/usr/share/man/man1/tmux.1.gz
./install/usr/share/terminfo/74/tmux
./install/usr/share/terminfo/74/tmux-256color
./usr/share/terminfo/74/tmux
./usr/share/terminfo/74/tmux-256color
(tmux.exe found / Ctrl+c'd out of find)
$ ./install/bin/tmux.exe
[exited]
$ export PATH=$PATH:"/install/bin/"
$ tmux
[exited]
$ vim ~/.bashrc
(added export command to .bashrc)

This touches on so many things. When things appear broken, suspect paths. Always suspect paths. The essential trick of environment variables is very similar between Windows and Unix machines. In particular, the command:

export PATH=$PATH:"/some/location/"

...and the fact that this only lasts during the currently active terminal's duration, and how if you want to make it permanent, you have to commit it somewhere else. In the case of nix-OSes, it's just in a .bashrc in your home directory. Under Windows, it's some Right-click on Computer / Properties, Advanced System Settings / Environment Variables / knowing what-the-hell to do there convolution. And God knows how it changes from Windows version to Windows version. Unix wins again. Oh sheesh, environment variable mapping under the impending Windows 10 Ubuntu Bash Shell is going to be a nightmare.

--------------------------------------------------------------------------------

## Thu, Jun 02, 2016 11:37:43 AM

### Debugged cron.daily (shebang directive needed)

Look at that to-do list I just put (once again) immediately above the most current journal entry. This is a convention I should really stick to, and update the (sterilized) to-do list accordingly. Okay, let's start with:

run-parts /etc/cron.daily

Wow, just switched my daily journal terminal over to my VirtualBox running Kubuntu. Nice. The markdown color-coding is much nicer. I have to check my vim color coding preferences. I think I override the default for Python here or there, but maybe it's time to take the defaults again. Must look into.
At any rate, let's understand what's going on with cron on Kubuntu (and by extension, Ubuntu and probably Debian too). Hmmmm. Look at /etc/crontab first.

# /etc/crontab: system-wide crontab
# Unlike any other crontab you don't have to run the `crontab'
# command to install the new version when you edit this file
# and files in /etc/cron.d. These files also have username fields,
# that none of the other crontabs do.
SHELL=/bin/sh
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
# m h dom mon dow user command
17 * * * * root cd / && run-parts --report /etc/cron.hourly
00 3 * * * root test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.daily )
47 6 * * 7 root test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.weekly )
52 6 1 * * root test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.monthly )
# /etc/crontab (END)

Okay, that's telling. First, I figured out how to copy text from a tmux panel. Just as quickly as I see that's an issue, I google-up an answer, and in some ways, it's **better** than copying-and-pasting from non-tmux terminals, because Ctrl+b, Enter actually removes ambiguity about modifier keys that need to be pressed to keep a vim-highlight separate from an OS-highlight, which you can see very clearly with the orange highlighting. Also, I discovered the panel shuffling magic of Ctrl+B, Spacebar. The discoveries here are that:

- All scheduled scripts run as root -- useful for testing.
- anacron is actually doing the work, and not cron

This URL has some suggestions: http://stackoverflow.com/questions/4984725/how-to-test-cron-job

username@hostname:/etc/cron.daily$ sudo -u root -i
[sudo] password for username:
root@hostname:~# run-parts -v /etc/cron.daily
run-parts: executing /etc/cron.daily/0anacron
run-parts: executing /etc/cron.daily/apport
run-parts: executing /etc/cron.daily/apt-compat
run-parts: executing /etc/cron.daily/bsdmainutils
run-parts: executing /etc/cron.daily/cracklib-runtime
run-parts: executing /etc/cron.daily/dpkg
run-parts: executing /etc/cron.daily/google-talkplugin
run-parts: executing /etc/cron.daily/logrotate
run-parts: executing /etc/cron.daily/man-db
run-parts: executing /etc/cron.daily/mlocate
run-parts: executing /etc/cron.daily/passwd
run-parts: executing /etc/cron.daily/popularity-contest
run-parts: executing /etc/cron.daily/reports
run-parts: failed to exec /etc/cron.daily/reports: Exec format error
run-parts: /etc/cron.daily/reports exited with return code 1
run-parts: executing /etc/cron.daily/sysstat
run-parts: executing /etc/cron.daily/update-notifier-common
root@hostname:~#

And there's the error. Oops... shebang! Needed #!/bin/sh at the top of the file, and that is all. Sheesh! Okay, don't forget that one again. When doing cron:

1. No file-extensions!
2. Absolute paths
3. Ability to run as root
4. Bash files must have shebang directives
5. Execution bit (chmod +x filename) must be set

Okay, time to switch back to my Windows 7 desktop. I think I'll get a tmux window running in a Cygwin terminal over there. I can't get too much tmux practice now.

--------------------------------------------------------------------------------

## Thu, Jun 02, 2016 10:48:24 AM

### Making That Book-writing Part Of My Daily Workflow

I watched the Alan Turing documentary on Netflix (finally) last night, and wow.
If I thought my work at Scala was thankless, what the UK did to Turing after WWII was epic and criminal and full of overtones of becoming like one's own enemy -- the UK becoming intolerant of different lifestyles to the extent of chemically castrating and sterilizing their own citizens. There's no more reason for you to sit here and read this than there is for you to sit and listen to the random thoughts going through the head of another person, who is not deliberately organizing and preparing any of those thoughts for public consumption. It's really just a lot of white noise, with the occasional blip of meaning... which I try to capture and then organize into location. That last bit of ultimate organizing... the producing the REAL product... THAT is the trick, and THAT is one of the last dots remaining to be connected. And THAT is what I need to work into my day-to-day work. One thing's for sure now, and that is that THIS journal is a constant. There's much more even here in THIS journal than what appears here on the page, which I blanked during my Puerto Rico trip when I over-shared personal stuff and blanked it. And before that, I kept paper journals (since 1988). And somewhere in there, I kept a Webmaster Journal at Scala which I still have in text format somewhere, which I will probably add to this -- maybe inline, and maybe as a linked-to leaf -- not sure yet. But continuity is key, and continuity is there. But continuity is not enough on its own. Continuity has to couple with thoughtfulness, insights, and organization with purpose. THAT is the "extruding" the book concept that I developed earlier. I must actually get about actually extruding that book. Focus my message. Get more things working for me 24x7, per my semi-automation majordomo sorcerers apprentice intelligent agents doing your bidding messaging that I seem to be dancing around. It's time for a new definition of legacy computing. -------------------------------------------------------------------------------- ## Thu, Jun 02, 2016 10:23:51 AM ### Switching Host-OS Virtual Desktops While Running VirtualBox Full-Screen Okay, one of my big breakthroughs was a very little one. A mere tap on the right-control-key from within VirtualBox momentarily stops the full-screen virtual session from capturing the keyboard, so a Ctrl+Alt+Arrow key will at that moment actually work, and bring me back to the host environment with a VirtuaWin-enabled virtual screen switch to a Windows Host screen. Pshwew! Maybe I'll avoid carpal tunnel syndrome after all. The keyboard shortcut I was doing before that was super-convoluted, and various Googling sessions and trying things from the VBox menus was fruitless, until finally today when someone suggested pressing the Shift BEFORE the right-Ctrl key... at which time I tried JUST the right-Ctrl key... and it worked. Sigh, go figure. Anyway, I'm not that deep into it, and I'll still be able to reap the benefits of VirtuaWin screen-switching with VirtualBox... maybe not seamlessly, but not too bad either. Okay, that's more delaying tactics before ACTUALLY getting to work. Now, think! - Debug why /etc/cron.daily didn't run last night - Answer the question regarding what this other website is doing. - Investigate the Google Analytics feature now in beta that gives you Webmaster Tools (Search Console) data. - Prepare a function to discover homepage URLs given just the name of an organization, and specific subject-matter pages within the site. 
- Investigate approaches to getting Darryl's Excel-based tools into a fast Excel-like Web UI -------------------------------------------------------------------------------- ## Thu, Jun 02, 2016 9:51:57 AM ### On Petty Thoughts It's always interesting when something becomes the biggest thing in someone's world for someone you know, and to you, it's something you can hardly even bother yourself to think about. I put everything now in the perspective of helping to raise Adi. Petty is easier than ever to see as petty. In addition to raising Adi, there's ensuring I have a reliable (and NYC-sized) income to let her continue to live (at least, part-time) in the Manhattan home she grew up in. That would be nice to keep, and thinking-ahead, this property is just going to become more and more valuable. Sure, there's always maintenance, but if it weren't a co-op, there'd be real estate tax. Many an estate home has been sold because they couldn't pay the taxes on the property. I wouldn't be the first, if I turn out wanting to save a little bit of money. But that sort of compromise is for the weak, who are unable to maneuver into better positions, increase their income, and all-around find creative solutions allowing one to live the life they want to live. One question is actually whether I'm living the life I want to live, or whether I haven't actually been artificially "locked into" a native New Yorker lifestyle by having a New York City native kid. And I'm not talking about an outer borough. Adi is a Manhattan kid, and I'm doing what my father failed to do -- actually LIVE IN New York City. The topic came up with him more than once over the years, I recall, and the answer usually was something along the lines of us not being able to live the sort of life that we're living. He didn't earn enough to live in the city, or probably even West Chester. He bought into a brand-new development -- the Philly sprawl-bergs. Okay, well then. That's enough introspection for the morning. The reports failed to run overnight, and it's time for me to debug /etc/cron.daily. No reason I can't completely understand why it didn't run and fix it. No reason to set up a separate physical server (yet) to get off my desktop virtual machine. I need to get myself together on multiple fronts. I'm reading Hackers: Heros of the Computer Revolution, and I'm getting to the part where they start talking about the MIT sour-puss price of continuously sustained and competitive zone-diving. They sorted everyone into winners and losers, and a lot of the loser behavior that was criticized by the MIT AI-lab crew seemed to be well-rounded life stuff. It's a very up-close and personal look at that special attribute that helps those with the potential to actually excel at what they do -- the nearly unlimited (exhaustion-bound) ability to focus on the problem at-hand. If I bring even 1/10th of the hacker ethic that they had the energy for at between 14 through 21 years old to bear on my own work at 45 years old, I will be able to "catch up" to where I should be, and give Adi a tremendous head-start in life -- in our modern life, and the new world that we're going into, which will be as different 20 years from now as today is from 20 years ago -- like from before when supercomputers fit in our pockets. Yep, I've got a good grip on what's important. Just don't ACTUALLY go broke, or get thrown in jail for anything, and I should be fine. 
--------------------------------------------------------------------------------

## Wed, Jun 01, 2016 2:01:51 PM

### Breather and Contemplation

Wow, was that a test. Frig! Okay, remember to be systematic and methodical. As soon as you get that panicky try-stuff-fast feeling, you're on the wrong track. The thing now is reproducibility and documentation. Do a pip freeze and make it part of the project.

Wed, Jun 01, 2016 4:35:35 PM

Pshwew! Just gave my counterpart in Tech access to the private repos, the credential files, and a demo of it working. Then, I went to one of the properties and did an update meeting, including showing the reports. Hmmmm. Okay, think through next steps. I have a late Triweekly meeting coming up in a half-hour. The most important thing I can do is to make sure that my virtual box REALLY triggers off on a scheduled basis, beginning tonight. Why did cron.daily not work on the last go-around? What are my other options? I have to start rising to that level that I believe myself to be at in my head. But also remember that behind every lightning bruiser is a colossal mishap just waiting to happen. Moving a lot of force around quickly has dangers. But I don't think I'm even looking all that quick to my peers. THAT needs to be fixed. I need to refine my stupid decisions, like NOT using AWS/EC2/boto for everything non-personal-hardware that I do. Why mess around with Rackspace or Wable? How do those REALLY help you in the future? Professional skills! Your biggest experience to have with OpenStack at this point is to build your own personal cloud.

Wed, Jun 01, 2016 6:19:32 PM

Had a happy hour. Checked over my cron.daily job. Everything tests good, from its permissions to its run-parts --test /etc/cron.daily check. It has identical permissions to everything else in that folder (which I presume runs), can execute when called directly, and has absolute paths to Python and the script. It doesn't have a dot-extension. So, I think it must just be laptop sleeping and VM issues. I think I'll bring in my CuBox server tomorrow.

--------------------------------------------------------------------------------

## Wed, Jun 01, 2016 9:52:00 AM

### Switching Amazon AWS Free Tier AMI Instance Into Ubuntu Instance

Okay, give a shot at working with the AMI instance you set up yesterday. You already have Dropbox working on it, and are not so far into it that you've fubar'd everything. Okay, work your way systematically and methodically through this. Check that Dropbox is still working. I put a file on the Dropbox location from my PC-side and am watching for it to appear from the AWS EC2 side. It's not instantaneous. Maybe due to t2.micro status -- or maybe due to the Dropbox server software and checking-cycle. Got to investigate if I'm going to rely on it. But sending archives via email would also have some delay -- oh, unless they were both sent and checked from the SAME gmail account (for instance). Keep that in mind if Dropbox doesn't work out. Oh, it wasn't started! I had done a restart. I had to do this (in addition to just a normal start for this session):

    python dropbox.py autostart y

Okay, you have to view yourself in a race. Get through this friggin thing. Don't get frustrated. Get done! Okay, get rid of the TWO Security Group contexts. Make it just one. Done. Check if postgres is running:

    /etc/init.d/postgresql status
    postmaster dead but pid file exists

Oops, but not a valid check if I do that as my default login.
I have to do it either AS postgres with pg_ctl or as root with ps aux. Yep, I see it running as root. Ugh! There's like already a bunch of Postgres VERSIONS installed on this AMI. Okay, I'm making a decision to speed all of this up. As appealing as going with an AMI instance may have been, I'm more familiar with Debian and Ubuntu, and I am comfortable with installing packages and building it up, instead of getting the wrong already-installed version used accidentally! Ugh. I'll lose the Dropbox install/config work, but who cares? Go faster! Okay, I can't change an AMI instance to an Ubuntu one, but I can create a new Ubuntu one and delete the AMI one. The language is "terminating" an instance on Elastic Beanstalk... okay, done. Now I have a fresh Ubuntu instance and a new pem file. I also have two security groups again. Okay, just go through this systematically... 1, 2, 3... 1? Login.

- Okay, the .pem and chmod stuff. Login via ssh for the first time. Done.
- sudo apt-get update... done.
- sudo apt-get upgrade... done.

Okay, just like with the AMI instance, see what's installed by default on the AWS Ubuntu:

    dpkg --get-selections | grep -v deinstall
    accountsservice      install
    (deleted a ton of stuff 8/4/2016)
    zlib1g:amd64         install

Okay, mostly Python 2 & 3 and libraries. Not bad. No PostgreSQL, so that's totally in my hands now and closer to my Kubuntu machine, and I can deal with that. 1, 2, 3... 1? Be systematic!

    cd ~ && wget -O - "https://www.dropbox.com/download?plat=lnx.x86_64" | tar xzf -
    ~/.dropbox-dist/dropboxd
    python dropbox.py autostart y

I can worry about excluding directories later. Plow through. This is a fresh system now with a working Dropbox. Find THE BEST instructions now on how to get PostgreSQL installed on Ubuntu 14.04. I believe these are going to be the best instructions:

- https://help.ubuntu.com/lts/serverguide/postgresql.html
- https://www.postgresql.org/download/linux/ubuntu/

Install it:

    sudo apt-get install postgresql postgresql-server-dev-all

Okay, done. Before doing the other extension stuff for the foreign table exposer, go through everything necessary to confirm Internet connectivity.

ALLOW INTERNET REQUESTS

    sudo vim /etc/postgresql/9.3/main/postgresql.conf

CONNECT TO TEMPLATE1 DB AS POSTGRES USER

    sudo -u postgres psql template1

ALTER POSTGRES USER PASSWORD

    ALTER USER postgres with encrypted password 'your_password';

EDIT pg_hba.conf TO ALLOW CONNECTIONS

    sudo vim /etc/postgresql/9.3/main/pg_hba.conf

RESTART SERVICE

    sudo service postgresql restart

Okay, now I should be able to test reaching it from another machine that has the postgres client software installed. Okay, got it. I use this command to get into it from outside:

    psql -h ipaddress -U postgres -W

Pshwew! Okay, now for the fdw stuff:

    sudo apt-get install python-pip
    sudo pip install pgxnclient
    sudo pgxnclient install foreign_table_exposer

postgresql.conf will need this line edited into it:

    shared_preload_libraries = 'foreign_table_exposer'

Then:

    sudo service postgresql restart

Okay, done that. Pshwew! Okay, all that remains is executing the "create table" statements of the foreign data wrapper trick, and magically... data sources. Okay, install git and give the postgres user access to the Dropbox location. Done. Clone my private sql repo. Done. Edit the paths to match new location.

    sudo apt-get install postgresql-contrib

(in console)

    CREATE EXTENSION file_fdw;
    CREATE SERVER file_fdw_server FOREIGN DATA WRAPPER file_fdw;
    \i fdwconnectors.sql

BAM!
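For the record, those "create table" statements in fdwconnectors.sql are just foreign-table definitions pointing file_fdw at the CSV files. A minimal sketch of one, driven from Python via psycopg2 -- the table name, columns and file path here are made up for illustration; connection details as above:

```python
import psycopg2

# Connect remotely the same way psql -h ipaddress -U postgres -W does
conn = psycopg2.connect(host="ipaddress", dbname="postgres",
                        user="postgres", password="your_password")
cur = conn.cursor()

# file_fdw maps a server-side CSV to a queryable table (hypothetical columns/path)
cur.execute("""
    CREATE FOREIGN TABLE report_keywords (
        keyword text,
        url     text
    ) SERVER file_fdw_server
    OPTIONS (filename '/home/postgres/Dropbox/pulse/keywords.csv',
             format 'csv', header 'true');
""")
conn.commit()

# Quick smoke test: the CSV is now just another table
cur.execute("SELECT count(*) FROM report_keywords;")
print(cur.fetchone()[0])
```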
Okay, downloaded trial version of Tableau (again), this time for my PC and connecting... perfect. End of a long journey. It was a rabbit hole, but a very worthwhile one. One's self-esteem gets wrapped up in these things. The trouble ended up being just dotting every last i and crossing every last t. Also, I'm on PostgreSQL 9.3 (instead of .4 or .5) which was a mistake, but it's the default from the Ubuntu 14.04 repo when you don't specify a version on apt-get. And then again, I am on Ubuntu 14.04 and Python 2.7, so being a few versions behind is sometimes a good thing. Lots of tiny little follow-ups to ensure TRUE automation. -------------------------------------------------------------------------------- ## Wed, Jun 01, 2016 9:22:23 AM ### Solving The Fundamental Challenge of Life Shit, didn't get done yesterday at work OR at home anything I had wanted to / planned to get done. I'm going to have to work fast today and at home tonight. I'm going to try to actually get out of here at 5:00 or 5:30 tonight, if at all possible, to try to make up for lost time. I'm getting that feeling of my head being barely above the surface. I have to handle more things in the same amount of time, but still give those things that require single-minded focus the attention they need to get done with some quality and strong foundations for future accomplishments to be built upon. Isn't that always the two equations that have to be simultaneously solved for? Getting into the flow/zone to do exceptional accomplishments is fundamentally incompatible with the daily janitorial maintenance of life that keeps you from becoming a mess. In the end, the one precious irreplaceable commodity is time. There's only so much, and then you're gone, no matter what intellectual constructs you try to prop up around it. And so, it's always a question of time management, and that sucks because it sounds like the un-enjoyable (to me) janitorial functions of life really ends up rising to be the most important thing -- unless you're okay with becoming a single-minded, not well-rounded, obsessive mess. My plan is to walk the vibrating edge in-between. Can I achieve great things with the one greatest thing among them being a profoundly positive influence in the raising and life of my daughter? This trumps all else. Sometimes, it's possible to actually work smarter and not harder, but that's on the back of a life where you front-loaded paying all your dues, so that smarter approaches can actually suggest themselves (which wouldn't necessarily be obvious to other people) when the situation arrives. Be alert for opportunity. And now onto finishing my Amazon t2.micro instance, which I couldn't get a general Internet postgres connection to. I think I want to do it more systematically. First, take a look at how to reset the OS to a fresh install. Oh, this is where I should do a journal-cut. -------------------------------------------------------------------------------- ## Tue, May 31, 2016 2:51:47 PM ### Struggling with PostgreSQL on AMI Connectivity Okay, it's almost 3:00 PM already, and basically all I have is an AWS AMI instance that I can log into, and have a fairly good understanding of what's installed. 
I have basically 3 things I have to do:

- Resolve a hostname to it
- Install Dropbox or equivalent
- Install PostgreSQL

Once those things are done, I should be able to just do the foreign data wrapper magic again, and whenever the files regenerate (on my desktop machine?), they will be updated in-location by Dropbox and just magically be available PostgreSQL data sources again -- but on the outside. Here's a Dropbox support page for installing Dropbox on Linux: https://www.dropbox.com/install?os=lnx

Okay, very interesting. Basically just these 2 lines:

    cd ~ && wget -O - "https://www.dropbox.com/download?plat=lnx.x86_64" | tar xzf -
    ~/.dropbox-dist/dropboxd

And then I had to visit a URL to activate it. After that, the instructions tell you to get https://www.dropbox.com/download?dl=packages/dropbox.py and control Dropbox from the command line. I will now exclude everything except for the pulse directory. Okay, I have Dropbox working and tested on my free-tier AWS AMI VM. Good. Now to install postgres and such. Let me re-list the steps. I know it was apt-get, but I will be able to adapt it:

    apt-get install postgresql-server-dev-all
    pip install pgxnclient
    sudo pgxnclient install foreign_table_exposer

postgresql.conf will need this line edited into it:

    shared_preload_libraries = 'foreign_table_exposer'

Hmmm. Okay, this is going to have far fewer requirements than the machine that's actually generating the CSV files (have to get down scheduling on my machine). But it's looking like:

    postgresql94-server.x86_64 : The programs needed to create and run a PostgreSQL server
    postgresql94-devel.x86_64 : PostgreSQL development header files and libraries
    python27-psycopg2.x86_64 : A PostgreSQL database adapter for Python (for later)

Okay, I think I have all my ducks in a row. Now, trigger off the yum installs. Here's my guide: https://wiki.postgresql.org/wiki/YUM_Installation

    sudo yum install postgresql postgresql-server postgresql-devel postgresql-contrib postgresql-docs

And now, I edit a file per: https://github.com/snowplow/snowplow/wiki/Setting-up-PostgreSQL

    sudo vim /var/lib/pgsql9/data/pg_hba.conf
    sudo vim /var/lib/pgsql9/data/postgresql.conf
    sudo /etc/init.d/postgresql94 start

OK! Woot! I still have the init.d system, even though it's not a Debian-based Linux. Look into that. Do Red Hat derivatives use that too? What is the AWS AMI derived from? Oh, and go into psql and look at the system table list:

    sudo -u postgres -i
    service postgresql initdb
    psql -U postgres
    \dS

And there they are! Okay, I've got a PostgreSQL instance on AMI, and it appears to be working. I have to check external connectivity, and... and what? Oh, get the foreign data wrapper stuff installed, and try querying against the CSV files from SQLWorkbench/J on my local desktop. 1, 2, 3... 1? Try connecting now with SQLWorkbench.

    sudo su postgres
    psql -U postgres
    ALTER USER postgres WITH PASSWORD 'somepassword';

Ugh. No success connecting to it, and the day is rapidly running out, and I'm having dinner with Adi tonight. Okay, think this all through. This is that panicky feeling that just doesn't have to exist. This is just something to be worked through systematically and methodically. There are no obstacles like this that are too big. If it comes to it, you can work on it at home now. I definitely dislike the process right now. I think I want to script the whole thing.
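If it does come to scripting the whole thing, a rough sketch is just replaying the same commands from this entry with subprocess -- the package list and URL are copied from above; the pg_hba.conf / postgresql.conf edits and passwords would still need templating, so they're only flagged here:

```python
import subprocess

def run(cmd):
    # shell=True because these commands reuse pipes and && chaining as-is
    print("+ " + cmd)
    subprocess.check_call(cmd, shell=True)

# Headless Dropbox install (first of the two lines above; dropboxd itself
# still has to be launched and linked to the account interactively).
run('cd ~ && wget -O - "https://www.dropbox.com/download?plat=lnx.x86_64" | tar xzf -')

# PostgreSQL from the distro repo (yum flavor, per this AMI instance).
run("sudo yum -y install postgresql postgresql-server postgresql-devel "
    "postgresql-contrib postgresql-docs")

# TODO: template the pg_hba.conf / postgresql.conf edits, then initdb and start.
```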
--------------------------------------------------------------------------------

## Tue, May 31, 2016 10:04:24 AM

### Going To Use AWS AMI For My Next Cloud Server

Oh, have to get my "Monday" morning report ready for boss. Okay, got that out. Totally clear, I have to push hard to make everything come together just-so. Hmmmm. How do I REALLY want to play my next step? Maybe it's time to take up the Boto AWS Python package instead of messing around with Wable. 5 servers for $8/mo was pretty awesome, but one server for $4/mo not so much so. Look back at Amazon's pricing situation, and my lowest point-of-entry for doing Boto stuff. If I'm going to spin up a new server today, WHY NOT make it on AWS, and maybe even tell the tech team they can just take over this image from me? Hmmmm. I can use my deployment tool to spin up a new image on their side -- just run whatever boto deployment process I create AS THEM, and the image will be theirs. Yup, that's the way to do it.

Being tired of the $.50/mo bottom-tier AWS service that the "free tier" switches over to automatically after a year, I had turned it off and discontinued all my AWS services -- essentially shutting it down completely enough to make my AWS account completely inactive. And I can't activate the free tier again, because my account has been suspended. Haha, these are the types of things that make Rackspace and others look appealing even in light of the behemoth that is AWS. Okay, sign up for the free tier again with a new account.

#### AWS Free Tier

- Compute (Amazon EC2): 750 hrs/month free for 1 year
- Storage (Amazon S3): 5 GB
- Database (Amazon RDS): 750 hrs/month

Okay, just went through the phone activation rigmarole. I should probably pull Paige into it from these very early steps -- lots of good learnings. But also, I have to just go as fast as possible right now -- my documentation concession is only what I write here. Make your key discoveries and decisions. First key decision is to stay very close to the well-beaten track in this case. Don't go off-rails on this project. Do these things the way Amazon WANTS us to do these things. This means: start with an AMI instance... Okay.

You are using the following Amazon EC2 resources in the US West (Oregon) region: (none active)

Next, I select Launch Instance. The options it gives are:

- Amazon Linux AMI 2016.03.1 (HVM), SSD Volume Type
- Red Hat Enterprise Linux 7.2 (HVM), SSD Volume Type
- SUSE Linux Enterprise Server 12 SP1 (HVM), SSD Volume Type
- Ubuntu Server 14.04 LTS (HVM), SSD Volume Type
- Microsoft Windows Server 2012 R2 Base

Okay, hmmm. Not so obvious to choose AMI... what's its repo system? Oh, it uses yum. Here's the AMI FAQ: https://aws.amazon.com/amazon-linux-ami/faqs/

Wouldn't things go much more smoothly if I just chose Ubuntu and had apt-get at my disposal? Oh, the description of the free tier AMI also includes:

> The Amazon Linux AMI is an EBS-backed, AWS-supported image. The default image
> includes AWS command line tools, Python, Ruby, Perl, and Java. The repositories
> include Docker, PHP, MySQL, PostgreSQL, and other packages.

Hmm, there's also Amazon RDS, which gives the option of:

- Amazon Aurora
- MySQL
- MariaDB (a new one to me)
- PostgreSQL
- Oracle
- SQL Server

Hmm, learn more. Ahh! PaaS vs IaaS. Choose IaaS. Your database isn't going to become so big, and your IaaS experience is gold. You WANT to have to be the sysadmin and devops person. That's a big part of what you're in it for.
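Since the plan is to drive all of this from Python eventually, here's a rough boto sketch (the 2.x API that ships preinstalled on these images) for launching the same kind of instance -- the AMI ID, key pair name and security group below are placeholders, and credentials are assumed to come from the usual boto config or environment variables:

```python
import boto.ec2

# US West (Oregon), per the console message above
conn = boto.ec2.connect_to_region("us-west-2")

# Placeholder values -- substitute a real AMI ID, key pair and security group
reservation = conn.run_instances(
    "ami-xxxxxxxx",
    instance_type="t2.micro",
    key_name="my-keypair",
    security_groups=["launch-wizard-1"],
)
print(reservation.instances[0].id)
```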
Use the boto package against a single free-tier AMI instance with the ability to promote yourself to superuser privileges on those occasions when you need to. And this is the AWS experience that... well, falls into the same category of all these other highly valuable skills that half half-a-foot in the proprietary world. Python APIs make the occasional dabbling in proprietary okay, because how hard will it really be to adapt from Python boto deployment scripts to Python OpenStack scripts? Oh, speaking of deployment, do a little quick investigation. How best to go about this? Okay, it's a General Purpose t2.micro (as opposed to a smaller nano and a larger small). Okay, I selected Review and Launch, and am presented by this warning: > Improve your instances' security. Your security group, launch-wizard-1, is > open to the world. Your instances may be accessible from any IP address. > We recommend that you update your security group rules to allow access from > known IP addresses only. You can also open additional ports in your > security group to facilitate access to the application or service you're > running, e.g., HTTP (80) for web servers. It is almost certainly security context like that that's cutting me off from work's Redshift connection from any machines not on the internal network. Makes sense, and is a sensible precaution. Okay, now I'm creating a new key pair. Moved my .pem file into my Cygwin .ssh location. Okay, and I had to chmod 400 it. And once logged in (which went smoothly), I was prompted to run: sudo yum update Okay, what's installed on this thing? Per http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/find-software.html sudo yum grouplist Loaded plugins: priorities, update-motd, upgrade-helper There is no installed groups file. Maybe run: yum groups mark convert (see man yum) Available Groups: Console internet tools DNS Name Server Development Libraries Development tools Editors FTP Server Java Development Legacy UNIX compatibility Mail Server MySQL Database MySQL Database client NFS file server Network Servers Networking Tools PHP Support Performance Tools Perl Support PostgreSQL Database client (version 8) PostgreSQL Database server (version 8) Scientific support System Tools TeX support Technical Writing Web Server Web Servlet Engine Done And a yum list installed: yum list installed Installed Packages Loaded plugins: priorities, update-motd, upgrade-helper Installed Packages acl.x86_64 2.2.49-6.11.amzn1 installed acpid.x86_64 1.0.10-2.1.6.amzn1 installed alsa-lib.x86_64 1.0.22-3.9.amzn1 installed at.x86_64 3.1.10-44.13.amzn1 installed attr.x86_64 2.4.46-12.10.amzn1 installed audit.x86_64 2.4.1-5.27.amzn1 installed audit-libs.x86_64 2.4.1-5.27.amzn1 installed authconfig.x86_64 6.2.8-9.27.amzn1 installed autogen-libopts.x86_64 5.18-5.8.amzn1 installed aws-amitools-ec2.noarch 1.5.7-1.0.amzn1 installed aws-apitools-as.noarch 1.0.61.6-1.0.amzn1 installed aws-apitools-common.noarch 1.1.0-1.9.amzn1 installed aws-apitools-ec2.noarch 1.7.3.0-1.0.amzn1 installed aws-apitools-elb.noarch 1.0.35.0-1.0.amzn1 installed aws-apitools-mon.noarch 1.0.20.0-1.0.amzn1 installed aws-cfn-bootstrap.noarch 1.4-11.6.amzn1 @amzn-updates aws-cli.noarch 1.10.8-1.37.amzn1 installed basesystem.noarch 10.0-4.9.amzn1 installed bash.x86_64 4.2.46-19.35.amzn1 installed bc.x86_64 1.06.95-1.10.amzn1 installed bind-libs.x86_64 32:9.8.2-0.37.rc1.45.amzn1 installed bind-utils.x86_64 32:9.8.2-0.37.rc1.45.amzn1 installed binutils.x86_64 2.23.52.0.1-55.65.amzn1 installed bzip2.x86_64 1.0.6-8.12.amzn1 installed 
bzip2-libs.x86_64 1.0.6-8.12.amzn1 installed ca-certificates.noarch 2015.2.6-65.0.1.15.amzn1 @amzn-updates checkpolicy.x86_64 2.1.10-1.9.amzn1 installed chkconfig.x86_64 1.3.49.3-2.14.amzn1 installed cloud-disk-utils.noarch 0.27-1.5.amzn1 installed cloud-init.noarch 0.7.6-2.11.amzn1 installed coreutils.x86_64 8.22-15.52.amzn1 installed cpio.x86_64 2.10-12.12.amzn1 installed cracklib.x86_64 2.8.16-4.14.amzn1 installed cracklib-dicts.x86_64 2.8.16-4.14.amzn1 installed cronie.x86_64 1.4.4-12.6.amzn1 installed cronie-anacron.x86_64 1.4.4-12.6.amzn1 installed crontabs.noarch 1.10-33.9.amzn1 installed cryptsetup.x86_64 1.6.6-3.21.amzn1 installed cryptsetup-libs.x86_64 1.6.6-3.21.amzn1 installed curl.x86_64 7.40.0-8.57.amzn1 @amzn-updates cyrus-sasl.x86_64 2.1.23-13.16.amzn1 installed cyrus-sasl-lib.x86_64 2.1.23-13.16.amzn1 installed cyrus-sasl-plain.x86_64 2.1.23-13.16.amzn1 installed dash.x86_64 0.5.5.1-4.5.amzn1 installed db4.x86_64 4.7.25-18.11.amzn1 installed db4-utils.x86_64 4.7.25-18.11.amzn1 installed dbus.x86_64 1:1.6.12-8.27.amzn1 installed dbus-libs.x86_64 1:1.6.12-8.27.amzn1 installed dejavu-fonts-common.noarch 2.33-6.6.amzn1 installed dejavu-sans-fonts.noarch 2.33-6.6.amzn1 installed dejavu-serif-fonts.noarch 2.33-6.6.amzn1 installed device-mapper.x86_64 1.02.93-3.26.amzn1 installed device-mapper-event.x86_64 1.02.93-3.26.amzn1 installed device-mapper-event-libs.x86_64 1.02.93-3.26.amzn1 installed device-mapper-libs.x86_64 1.02.93-3.26.amzn1 installed device-mapper-persistent-data.x86_64 0.3.2-1.7.amzn1 installed dhclient.x86_64 12:4.1.1-43.P1.24.amzn1 installed dhcp-common.x86_64 12:4.1.1-43.P1.24.amzn1 installed diffutils.x86_64 3.3-4.15.amzn1 installed dmraid.x86_64 1.0.0.rc16-11.8.amzn1 installed dmraid-events.x86_64 1.0.0.rc16-11.8.amzn1 installed dracut.noarch 004-336.28.amzn1 installed dracut-modules-growroot.noarch 0.20-1.5.amzn1 installed dump.x86_64 1:0.4-0.6.b42.7.amzn1 installed e2fsprogs.x86_64 1.42.12-4.40.amzn1 installed e2fsprogs-libs.x86_64 1.42.12-4.40.amzn1 installed ec2-net-utils.noarch 0.4-1.25.amzn1 installed ec2-utils.noarch 0.4-1.25.amzn1 installed ed.x86_64 1.1-3.3.8.amzn1 installed elfutils-libelf.x86_64 0.163-3.18.amzn1 installed epel-release.noarch 6-8.9.amzn1 installed ethtool.x86_64 2:3.15-2.27.amzn1 installed expat.x86_64 2.1.0-8.18.amzn1 installed file.x86_64 5.22-4.31.amzn1 installed file-libs.x86_64 5.22-4.31.amzn1 installed filesystem.x86_64 2.4.30-3.8.amzn1 installed findutils.x86_64 1:4.4.2-6.9.amzn1 installed fipscheck.x86_64 1.3.1-3.13.amzn1 installed fipscheck-lib.x86_64 1.3.1-3.13.amzn1 installed fontconfig.x86_64 2.8.0-5.8.amzn1 installed fontpackages-filesystem.noarch 1.41-1.1.2.amzn1 installed freetype.x86_64 2.3.11-15.14.amzn1 installed gawk.x86_64 3.1.7-10.10.amzn1 installed gdbm.x86_64 1.8.0-36.6.amzn1 installed gdisk.x86_64 0.8.10-1.5.amzn1 installed generic-logos.noarch 17.0.0-2.5.amzn1 installed get_reference_source.noarch 1.2-0.4.amzn1 installed giflib.x86_64 4.1.6-3.1.6.amzn1 installed glib2.x86_64 2.36.3-5.18.amzn1 installed glibc.x86_64 2.17-106.167.amzn1 installed glibc-common.x86_64 2.17-106.167.amzn1 installed gmp.x86_64 6.0.0-11.16.amzn1 installed gnupg2.x86_64 2.0.28-1.30.amzn1 installed gpgme.x86_64 1.4.3-5.15.amzn1 installed gpm-libs.x86_64 1.20.6-12.8.amzn1 installed grep.x86_64 2.20-1.16.amzn1 installed groff.x86_64 1.22.2-8.11.amzn1 installed groff-base.x86_64 1.22.2-8.11.amzn1 installed grub.x86_64 1:0.97-94.30.amzn1 installed grubby.x86_64 7.0.15-5.7.amzn1 installed gzip.x86_64 1.5-8.18.amzn1 installed hesiod.x86_64 
3.1.0-19.6.amzn1 installed hmaccalc.x86_64 0.9.12-1.9.amzn1 installed hwdata.noarch 0.233-14.1.18.amzn1 installed info.x86_64 5.1-4.10.amzn1 installed initscripts.x86_64 9.03.49-1.34.amzn1 installed iproute.x86_64 4.4.0-3.23.amzn1 installed iptables.x86_64 1.4.18-1.22.amzn1 installed iputils.x86_64 20121221-7.13.amzn1 @amzn-updates irqbalance.x86_64 2:1.0.8-1.23.amzn1 installed java-1.7.0-openjdk.x86_64 1:1.7.0.101-2.6.6.1.67.amzn1 installed javapackages-tools.noarch 0.9.1-1.5.amzn1 installed jpackage-utils.noarch 1.7.5-27.17.amzn1 installed kbd.x86_64 1.15-11.4.amzn1 installed kbd-misc.noarch 1.15-11.4.amzn1 installed kernel.x86_64 4.4.8-20.46.amzn1 installed kernel.x86_64 4.4.10-22.54.amzn1 @amzn-updates kernel-tools.x86_64 4.4.10-22.54.amzn1 @amzn-updates keyutils.x86_64 1.5.8-3.12.amzn1 installed keyutils-libs.x86_64 1.5.8-3.12.amzn1 installed kmod.x86_64 14-10.10.amzn1 installed kmod-libs.x86_64 14-10.10.amzn1 installed kpartx.x86_64 0.4.9-72.8.amzn1 installed krb5-libs.x86_64 1.13.2-12.40.amzn1 installed lcms2.x86_64 2.5-4.4.amzn1 installed less.x86_64 436-13.12.amzn1 installed libICE.x86_64 1.0.6-1.4.amzn1 installed libSM.x86_64 1.2.1-2.6.amzn1 installed libX11.x86_64 1.6.0-2.2.12.amzn1 installed libX11-common.x86_64 1.6.0-2.2.12.amzn1 installed libXau.x86_64 1.0.6-4.9.amzn1 installed libXcomposite.x86_64 0.4.3-4.6.amzn1 installed libXext.x86_64 1.3.2-2.1.10.amzn1 installed libXfont.x86_64 1.4.5-5.12.amzn1 installed libXi.x86_64 1.7.2-2.2.9.amzn1 installed libXrender.x86_64 0.9.8-2.1.9.amzn1 installed libXtst.x86_64 1.2.2-2.1.9.amzn1 installed libacl.x86_64 2.2.49-6.11.amzn1 installed libaio.x86_64 0.3.109-12.8.amzn1 installed libassuan.x86_64 2.0.3-3.3.amzn1 installed libattr.x86_64 2.4.46-12.10.amzn1 installed libblkid.x86_64 2.23.2-22.26.amzn1 installed libcap.x86_64 2.16-5.5.8.amzn1 installed libcap-ng.x86_64 0.7.3-5.13.amzn1 installed libcgroup.x86_64 0.40.rc1-5.11.amzn1 installed libcom_err.x86_64 1.42.12-4.40.amzn1 installed libcurl.x86_64 7.40.0-8.57.amzn1 @amzn-updates libedit.x86_64 2.11-4.20080712cvs.1.6.amzn1 installed libevent.x86_64 2.0.18-1.11.amzn1 installed libffi.x86_64 3.0.13-11.4.amzn1 installed libfontenc.x86_64 1.0.5-2.6.amzn1 installed libgcc48.x86_64 4.8.3-9.109.amzn1 installed libgcrypt.x86_64 1.5.3-12.18.amzn1 installed libgpg-error.x86_64 1.11-1.12.amzn1 installed libgssglue.x86_64 0.1-11.7.amzn1 installed libicu.x86_64 50.1.2-11.12.amzn1 installed libidn.x86_64 1.18-2.8.amzn1 installed libjpeg-turbo.x86_64 1.2.90-5.14.amzn1 installed libmount.x86_64 2.23.2-22.26.amzn1 installed libnfsidmap.x86_64 0.25-11.10.amzn1 installed libnih.x86_64 1.0.1-7.8.amzn1 installed libnl.x86_64 1.1.4-2.10.amzn1 installed libpipeline.x86_64 1.2.3-3.3.amzn1 installed libpng.x86_64 2:1.2.49-2.14.amzn1 installed libpsl.x86_64 0.6.2-1.2.amzn1 installed libpwquality.x86_64 1.2.3-4.8.amzn1 installed libselinux.x86_64 2.1.10-3.22.amzn1 installed libselinux-utils.x86_64 2.1.10-3.22.amzn1 installed libsemanage.x86_64 2.1.6-3.13.amzn1 installed libsepol.x86_64 2.1.7-3.12.amzn1 installed libss.x86_64 1.42.12-4.40.amzn1 installed libssh2.x86_64 1.4.2-2.13.amzn1 installed libstdc++48.x86_64 4.8.3-9.109.amzn1 installed libsysfs.x86_64 2.1.0-7.10.amzn1 installed libtasn1.x86_64 2.3-6.6.amzn1 installed libtirpc.x86_64 0.2.4-0.3.13.amzn1 installed libudev.x86_64 173-4.13.amzn1 installed libuser.x86_64 0.60-7.23.amzn1 installed libutempter.x86_64 1.1.5-4.1.6.amzn1 installed libuuid.x86_64 2.23.2-22.26.amzn1 installed libverto.x86_64 0.2.5-4.9.amzn1 installed libxcb.x86_64 1.8.1-1.18.amzn1 
installed libxml2.x86_64 2.9.1-6.2.50.amzn1 installed libxml2-python27.x86_64 2.9.1-6.2.50.amzn1 installed libxslt.x86_64 1.1.28-5.12.amzn1 installed libyaml.x86_64 0.1.6-6.7.amzn1 installed logrotate.x86_64 3.7.8-17.13.amzn1 installed lsof.x86_64 4.82-4.10.amzn1 installed lua.x86_64 5.1.4-4.1.9.amzn1 installed lvm2.x86_64 2.02.115-3.26.amzn1 installed lvm2-libs.x86_64 2.02.115-3.26.amzn1 installed mailcap.noarch 2.1.31-2.7.amzn1 installed make.x86_64 1:3.82-21.10.amzn1 installed man-db.x86_64 2.6.3-9.3.amzn1 installed man-pages.noarch 4.04-2.15.amzn1 installed mdadm.x86_64 3.2.6-7.32.amzn1 installed mingetty.x86_64 1.08-5.9.amzn1 installed nano.x86_64 2.5.3-1.19.amzn1 installed nc.x86_64 1.84-24.8.amzn1 installed ncurses.x86_64 5.7-3.20090208.13.amzn1 installed ncurses-base.x86_64 5.7-3.20090208.13.amzn1 installed ncurses-libs.x86_64 5.7-3.20090208.13.amzn1 installed net-tools.x86_64 1.60-110.10.amzn1 installed newt.x86_64 0.52.11-3.11.amzn1 installed newt-python27.x86_64 0.52.11-3.11.amzn1 installed nfs-utils.x86_64 1:1.3.0-0.21.amzn1 installed nspr.x86_64 4.11.0-1.37.amzn1 @amzn-updates nss.x86_64 3.21.0-9.76.amzn1 @amzn-updates nss-softokn.x86_64 3.16.2.3-14.2.38.amzn1 @amzn-updates nss-softokn-freebl.x86_64 3.16.2.3-14.2.38.amzn1 @amzn-updates nss-sysinit.x86_64 3.21.0-9.76.amzn1 @amzn-updates nss-tools.x86_64 3.21.0-9.76.amzn1 @amzn-updates nss-util.x86_64 3.21.0-2.2.50.amzn1 @amzn-updates ntp.x86_64 4.2.6p5-36.29.amzn1 installed ntpdate.x86_64 4.2.6p5-36.29.amzn1 installed ntsysv.x86_64 1.3.49.3-2.14.amzn1 installed numactl.x86_64 2.0.7-8.11.amzn1 installed openldap.x86_64 2.4.40-7.28.amzn1 installed openssh.x86_64 6.6.1p1-25.61.amzn1 installed openssh-clients.x86_64 6.6.1p1-25.61.amzn1 installed openssh-server.x86_64 6.6.1p1-25.61.amzn1 installed openssl.x86_64 1:1.0.1k-14.91.amzn1 installed p11-kit.x86_64 0.18.5-2.3.amzn1 installed p11-kit-trust.x86_64 0.18.5-2.3.amzn1 installed pam.x86_64 1.1.8-12.33.amzn1 installed pam_ccreds.x86_64 10-4.9.amzn1 installed pam_krb5.x86_64 2.3.11-9.12.amzn1 installed pam_passwdqc.x86_64 1.0.5-6.8.amzn1 installed parted.x86_64 2.1-21.18.amzn1 installed passwd.x86_64 0.79-4.13.amzn1 installed pciutils.x86_64 3.1.10-4.11.amzn1 installed pciutils-libs.x86_64 3.1.10-4.11.amzn1 installed pcre.x86_64 8.21-7.7.amzn1 installed perl.x86_64 4:5.16.3-283.37.amzn1 installed perl-Carp.noarch 1.26-244.5.amzn1 installed perl-Digest.noarch 1.17-245.5.amzn1 installed perl-Digest-HMAC.noarch 1.03-5.7.amzn1 installed perl-Digest-MD5.x86_64 2.52-3.5.amzn1 installed perl-Digest-SHA.x86_64 1:5.85-3.5.amzn1 installed perl-Encode.x86_64 2.51-7.5.amzn1 installed perl-Exporter.noarch 5.68-3.5.amzn1 installed perl-File-Path.noarch 2.09-2.5.amzn1 installed perl-File-Temp.noarch 0.23.01-3.5.amzn1 installed perl-Filter.x86_64 1.49-3.5.amzn1 installed perl-Getopt-Long.noarch 2.40-2.5.amzn1 installed perl-HTTP-Tiny.noarch 0.033-3.6.amzn1 installed perl-PathTools.x86_64 3.40-5.5.amzn1 installed perl-Pod-Escapes.noarch 1:1.04-283.37.amzn1 installed perl-Pod-Perldoc.noarch 3.20-4.7.amzn1 installed perl-Pod-Simple.noarch 1:3.28-4.6.amzn1 installed perl-Pod-Usage.noarch 1.63-3.5.amzn1 installed perl-Scalar-List-Utils.x86_64 1.27-248.5.amzn1 installed perl-Socket.x86_64 2.010-3.5.amzn1 installed perl-Storable.x86_64 2.45-3.5.amzn1 installed perl-Text-ParseWords.noarch 3.29-4.5.amzn1 installed perl-Time-Local.noarch 1.2300-2.5.amzn1 installed perl-constant.noarch 1.27-2.5.amzn1 installed perl-libs.x86_64 4:5.16.3-283.37.amzn1 installed perl-macros.x86_64 4:5.16.3-283.37.amzn1 installed 
perl-parent.noarch 1:0.225-244.5.amzn1 installed perl-podlators.noarch 2.5.1-3.8.amzn1 installed perl-threads.x86_64 1.87-4.5.amzn1 installed perl-threads-shared.x86_64 1.43-6.5.amzn1 installed pinentry.x86_64 0.7.6-6.11.amzn1 installed pkgconfig.x86_64 1:0.27.1-2.7.amzn1 installed policycoreutils.x86_64 2.1.12-5.23.amzn1 installed popt.x86_64 1.13-7.7.amzn1 installed procmail.x86_64 3.22-25.1.6.amzn1 installed procps.x86_64 3.2.8-30.14.amzn1 installed psacct.x86_64 6.3.2-63.8.amzn1 installed psmisc.x86_64 22.20-8.12.amzn1 installed pth.x86_64 2.0.7-9.3.7.amzn1 installed python27.x86_64 2.7.10-4.120.amzn1 installed python27-PyYAML.x86_64 3.10-3.10.amzn1 installed python27-babel.noarch 0.9.4-5.1.8.amzn1 installed python27-backports.x86_64 1.0-3.14.amzn1 installed python27-backports-ssl_match_hostname3.4.0.2-1.12.amzn1 installed python27-boto.noarch 2.39.0-1.0.amzn1 installed python27-botocore.noarch 1.3.30-1.50.amzn1 installed python27-chardet.noarch 2.0.1-7.7.amzn1 installed python27-colorama.noarch 0.2.5-1.7.amzn1 installed python27-configobj.noarch 4.7.2-7.15.amzn1 installed python27-crypto.x86_64 2.6.1-1.12.amzn1 installed python27-daemon.noarch 1.5.2-1.5.amzn1 installed python27-dateutil.noarch 2.1-1.3.amzn1 installed python27-devel.x86_64 2.7.10-4.120.amzn1 installed python27-docutils.noarch 0.11-1.15.amzn1 installed python27-ecdsa.noarch 0.11-3.3.amzn1 installed python27-imaging.x86_64 1.1.6-19.9.amzn1 installed python27-iniparse.noarch 0.3.1-2.1.9.amzn1 installed python27-jinja2.noarch 2.7.2-2.15.amzn1 installed python27-jmespath.noarch 0.7.1-1.9.amzn1 installed python27-jsonpatch.noarch 1.2-2.5.amzn1 installed python27-jsonpointer.noarch 1.0-3.4.amzn1 installed python27-kitchen.noarch 1.1.1-5.6.amzn1 installed python27-libs.x86_64 2.7.10-4.120.amzn1 installed python27-lockfile.noarch 0.8-3.5.amzn1 installed python27-markupsafe.x86_64 0.11-4.6.amzn1 installed python27-paramiko.noarch 1.15.1-1.5.amzn1 installed python27-pip.noarch 6.1.1-1.21.amzn1 installed python27-ply.noarch 3.4-3.12.amzn1 installed python27-pyasn1.noarch 0.1.7-2.9.amzn1 installed python27-pycurl.x86_64 7.19.0-17.12.amzn1 installed python27-pygpgme.x86_64 0.3-9.12.amzn1 installed python27-pyliblzma.x86_64 0.5.3-11.6.amzn1 installed python27-pystache.noarch 0.5.3-2.8.amzn1 installed python27-pyxattr.x86_64 0.5.0-1.6.amzn1 installed python27-requests.noarch 1.2.3-5.10.amzn1 installed python27-rsa.noarch 3.3-2.7.amzn1 installed python27-setuptools.noarch 12.2-1.30.amzn1 installed python27-simplejson.x86_64 3.6.5-1.12.amzn1 installed python27-six.noarch 1.8.0-1.23.amzn1 installed python27-urlgrabber.noarch 3.9.1-9.13.amzn1 installed python27-urllib3.noarch 1.8.2-1.5.amzn1 installed python27-virtualenv.noarch 12.0.7-1.12.amzn1 installed quota.x86_64 1:4.00-7.18.amzn1 installed quota-nls.noarch 1:4.00-7.18.amzn1 installed readline.x86_64 6.2-9.14.amzn1 installed rmt.x86_64 1:0.4-0.6.b42.7.amzn1 installed rng-tools.x86_64 5-7.12.amzn1 installed rootfiles.noarch 8.1-6.1.8.amzn1 installed rpcbind.x86_64 0.2.0-11.8.amzn1 installed rpm.x86_64 4.11.2-2.73.amzn1 installed rpm-build-libs.x86_64 4.11.2-2.73.amzn1 installed rpm-libs.x86_64 4.11.2-2.73.amzn1 installed rpm-python27.x86_64 4.11.2-2.73.amzn1 installed rsync.x86_64 3.0.6-12.13.amzn1 installed rsyslog.x86_64 5.8.10-9.26.amzn1 installed ruby.noarch 1:2.0-0.3.amzn1 installed ruby20.x86_64 2.0.0.648-1.29.amzn1 installed ruby20-irb.noarch 2.0.0.648-1.29.amzn1 installed ruby20-libs.x86_64 2.0.0.648-1.29.amzn1 installed rubygem20-bigdecimal.x86_64 1.2.0-1.29.amzn1 installed 
rubygem20-json.x86_64 1.8.3-1.51.amzn1 installed rubygem20-psych.x86_64 2.0.0-1.29.amzn1 installed rubygem20-rdoc.noarch 4.2.2-1.43.amzn1 installed rubygems20.noarch 2.0.14.1-1.29.amzn1 installed screen.x86_64 4.0.3-16.5.amzn1 installed sed.x86_64 4.2.1-10.10.amzn1 installed sendmail.x86_64 8.14.4-8.12.amzn1 installed setserial.x86_64 2.17-25.7.amzn1 installed setup.noarch 2.8.14-20.12.amzn1 installed sgpio.x86_64 1.2.0.10-5.7.amzn1 installed shadow-utils.x86_64 2:4.1.4.2-13.10.amzn1 installed shared-mime-info.x86_64 1.1-7.7.amzn1 installed slang.x86_64 2.2.1-1.8.amzn1 installed sqlite.x86_64 3.7.17-6.13.amzn1 installed sudo.x86_64 1.8.6p3-20.22.amzn1 @amzn-updates sysctl-defaults.noarch 1.0-1.1.amzn1 installed sysfsutils.x86_64 2.1.0-7.10.amzn1 installed system-release.noarch 2016.03-0.5 installed sysvinit.x86_64 2.87-6.dsf.15.amzn1 installed tar.x86_64 2:1.26-31.22.amzn1 installed tcp_wrappers.x86_64 7.6-75.11.amzn1 installed tcp_wrappers-libs.x86_64 7.6-75.11.amzn1 installed time.x86_64 1.7-38.9.amzn1 installed tmpwatch.x86_64 2.9.16-4.10.amzn1 installed traceroute.x86_64 3:2.0.14-2.7.amzn1 installed ttmkfdir.x86_64 3.0.9-32.1.5.amzn1 installed tzdata.noarch 2016d-1.62.amzn1 @amzn-updates tzdata-java.noarch 2016d-1.62.amzn1 @amzn-updates udev.x86_64 173-4.13.amzn1 installed unzip.x86_64 6.0-2.9.amzn1 installed update-motd.noarch 1.0.1-3.0.amzn1 installed upstart.x86_64 0.6.5-13.3.13.amzn1 installed ustr.x86_64 1.0.4-9.1.6.amzn1 installed util-linux.x86_64 2.23.2-22.26.amzn1 installed vim-common.x86_64 2:7.4.1759-1.40.amzn1 installed vim-enhanced.x86_64 2:7.4.1759-1.40.amzn1 installed vim-filesystem.x86_64 2:7.4.1759-1.40.amzn1 installed vim-minimal.x86_64 2:7.4.1759-1.40.amzn1 installed wget.x86_64 1.17.1-1.17.amzn1 installed which.x86_64 2.19-6.10.amzn1 installed words.noarch 3.0-17.8.amzn1 installed xorg-x11-font-utils.x86_64 1:7.2-11.5.amzn1 installed xorg-x11-fonts-Type1.noarch 7.2-9.1.5.amzn1 installed xz.x86_64 5.1.2-12alpha.12.amzn1 installed xz-libs.x86_64 5.1.2-12alpha.12.amzn1 installed yum.noarch 3.4.3-137.65.amzn1 installed yum-metadata-parser.x86_64 1.1.4-8.12.amzn1 installed yum-plugin-priorities.noarch 1.1.31-29.26.amzn1 installed yum-plugin-upgrade-helper.noarch 1.1.31-29.26.amzn1 installed yum-utils.noarch 1.1.31-29.26.amzn1 installed zip.x86_64 3.0-1.10.amzn1 installed zlib.x86_64 1.2.8-7.18.amzn1 installed Wow, these things are pretty friggn' loaded for "minimal" distros. But it does look like the most common requirements for stuff, so it's surely designed to minimize dependency nightmares while still keeping the image fairly small. Tue, May 31, 2016 1:12:54 PM Okay, got distracted debugging an issue in the report, and in 15-minutes I've got my menu update meeting. Also, I'm going to be meeting with Adi for dinner tonight, screwing both the ability to work late AND do anything on the homefront by leaving promptly at 6:00 PM today, as I had planned. Ugh! Okay, the only way to handle this that sort of thing in a way that I've been able to pull off before is to get such such a firm grip on the reins that no matter which way the horse bucks, I have felt it coming, accommodate fast, and keep everything running smoothly WHILE demonstrating a little bit of my firm control over the whole situation to all onlookers. Okay, I actually have a login on my PC to the AWS AMI instance, and... and... I'm probably going to lose it in a few minutes when I undock my Windows machine to go to the menu meeting. Okay, I am as not-in-the-zone right now as one can be. 
At least I can still imagine the prize for today, even if I got knocked off its trail. My own fault by the reports not being flawless already. It's funny how debugging can hold you back from your own forward movement. I still don't know what went wrong in those few rows that got messed up. Okay, got it. It is in fact duplicate keywords, like when it showed up in the URL columns. Shit, I can't show keywords twice. Just use sets to keep any keyword being encountered from being added again. Different solution than the duplicate title tags, for sure. -------------------------------------------------------------------------------- ## Tue, May 31, 2016 9:51:43 AM ### It All Leads Up To Adi's Invention Shop I had attempted to make my "really big" impact repeatedly in the past, and each time, for some reason or another (usually, my own), they had fizzled. First, in "saving" Commodore and the Amiga... perhaps among the worthiest of the causes, but the most donquixotesque attempt of the bunch. Next came Scala Multimedia (later, to become Scala Digital Signage) which I had to drag kicking and screaming to profitability (an expression that came up WITH THEM more than once). Even once attaining profitability, they still were to all appearances the most thankless and clueless bunch I ever interacted with. Instead of recognizing what I built (which by that time ran everything but accounting) as an innovative sort of proto-Ruby-on-Rails, they just piled on the criticism of my techniques. A lot of energy went into counteracting my efforts, instead of just shoveled into sales follow-up, which was what I told them to go do whenever they tried to engage in circular deconstruction (more on that later, I'm sure). There was just a deep and total lack of sincerity in actually wanting (and "hungering") for success, and making an meaningful difference -- that "ding in the Universe" thing that existed so well in Commodore, but not so much in it's re-basing-on-PC spin-off's like Scala. I guess I can't criticize too much, as they still exist today and appear to be leaders in digital signage. I like to think I played my part in getting them there. Later, came Connors Communications that tempted me out to New York City, and then 360i after that, which was an organization that actually did have its breakout moment WHILE I WAS THERE, but I myself wasn't in the mental or physical state to travel that journey with them -- as I was bogged down in an untenable marriage and having my first kid. So, I was overladed and burned-out, and once again down on the field of SEO -- as much so as when I was first compelled to take-up the label of SEO, which I didn't like very much either. Okay, so here I am at ZD as the Senior SEO Director. It's time to get back into that place where I can make a big, big impact for an organization that (I believe) very much needs what I have to offer... but I have to up-my offerings by quite a few notches. That feeling of moving icebergs from my fingertips needs to be embraced and enhanced. From the very-tiny Levinux-like tricks, to the very large scale deployment and devops sanity-maintenance tricks and big data visualization and insight tricks that are so very hot these days. First become surprisingly competent at them all, and gradually work your way towards mastery in those areas where you really care to, and YOUR DAUGHTER WILL MOST BENEFIT from, you mastering. Fun data visualization stuff? Seeing meaning that others don't? 
Applying imagination first towards data-sets, and then equipped with critical (unfair advantage) insights in hand, then applying imagination towards true invention -- like the invention shop she always talks about. GO OUT OF YOUR WAY to encourage that. -------------------------------------------------------------------------------- ## Tue, May 31, 2016 9:22:45 AM ### Have To Optimize My Time At Home I forgot to bring my key to the drawer that has my Mac in it... again. Ugh! I forget how much I'm coming to rely on having that around as a second rapidly carry-around PC and a journal-typing machine. Oh well -- adaptability! Get those reports running ASAP! You need a backup key system. It's long overdue to be reading Hackers now by Steven Levy. Also, I'm intermixing it with listening to the Partially Derivative data science podcast. I need to keep on my current path, and let the "SEO" work that I'm doing with Python ever-increasingly dovetail with other projects that are of great importance around this place. Once again, my attention is drawn towards the necessity to have a competent, working understanding of the very zoomed-in "tree" details simultaneously with the 40K-foot "forest" big picture. One without the other really leaves you ineffectual as a doer of great things. You either need direct, personal command over those up-close tree details, or you need a trustworthy and motivated network of individuals that you can "automate" to do your bidding down at the tree-level, while you yourself work primarily from, and issue mandates up from that big-picture level. Nope. Tree-details it is for me, for sure. Just don't get lost in the trees. Keep zooming out. Keep rocketing your head up into the clouds to look down sufficiently to hold-steady on a good course overt the long haul. Don't let petty small things stop you, including your own stupid oversights and bad habits. Fix those bad habits and reduce the oversights your own systems WORK FOR YOU as well as you make them to work for your employers. Solve simultaneous equations. Kill many birds with one stone (figuratively). Okay... I didn't think through my next steps with the need to keep my system running and to become the kernel of the true automation FROM HOME as much as I probably should have over this holiday weekend, but I was with Adi, so... and so I thought about it precisely as much as I should have... not at all. What I really need to do now more than anything is to walk out of here at 6:00 PM every night for awhile, and maximize my time getting organized at home. It will lead to thing that leads to thing that leads to thing that will move my life forward in the ways that I need to at this point. -------------------------------------------------------------------------------- ## Mon May 30 22:08:50 EDT 2016 ### Gotta Find Time In The Margins Okay, now you have a wee-bit of energy-sapped time to deal with. What you have to do is as-fast-as-possible, put yourself back on track. Do that quick 80/20 rule cleaning pass of the entire apartment in the little bits of time you have after work, and before your next weekend begins again. Weekends are basically off-limits as a time to get things done on the homefront. And I'm tired and sapped of energy every day after I come home from a long day at work. And so, the difficult equation remains in the original problem that I thought real and true separation would finally achieve. Okay... I do have time. It's just in the margins and the scraps left over, 4 days-a-week... generally. 
I have to make that work. I have to live a much richer, more effective life, pushing things forward bit-by-bit, in those margins. -------------------------------------------------------------------------------- ## Sun May 29 16:47:00 EDT 2016 ### Memorial Day Weekend With Adi in Inwood It's time to dig myself out of the blit-hole. I let Adi have one of her neighborhood friends over, and one of the first things Adi did to impress her friend was to dump a bucket of coins all over my bedroom floor, after I explicitly told her repeated times that that was not allowed. I had been cleaning coins up off the bedroom floor for weeks prior from the last time she promised the same thing and promptly broke it. And so now, that sort of flaunt it in your face disregard for other peoples wishes, and actual developing meanness will be punished. Her play-date with her friend was over after that, and she had to help me pick up every last coin from the floor. I can't allow her to be mean like that in my presence, much-less actually mean like that to me for the entertainment of her friends. Now, she's in her room sulking, and I'm going to do some of the cleaning that I don't have the energy for after work and don't have the time for on weekends. Adi probably won't recognize the place when she comes out of her bedroom. A big, effective 80/20 rule sweep. Get at least one full sweep done. It always seems that the people in my life are almost in shock and certainly furious anger when the learn that I'm not a push over. Ahh, the repeating patterns of my life. Allowing people to take me for a ride, and then have a very furious rider when they realize their vehicle has a mind of its own -- and a very strong one, at that. Just because I was born to be a work vehicle, doesn't mean I will allow myself to be driven into the ground. Surprise! It's gong to make Adi respect me more than not. When she doesn't get her way, her threats are the meanest of mean, going right for the jugular of most hurtful. Yep, she's smart. But now it's up to me to help her have a little more empathy. Mon May 30 21:59:05 EDT 2016 Well, the conclusion to that is that we went over to their place after my Adi settled down and learned the lesson. If I allow people to push me around in front of other people, with a mean spirited, hey, look at what I can make my Dad do attitude, then she learns the hard way the error in that particular judgement. All good things that come from Daddy can now be, most unpleasantly and unfortunately, conditional on altered behavior. I simply won't tolerate that sort of treating of another human being's feelings as unimportant, and to be readily disregarded for entertainment value. If she's somehow getting those lessons, they will not translate into behavior that she's permitted to indulge in on "Daddy time". Wow, that whole I'm not a push-over... surprise! seems to be a recurring theme of my life, and the reason for many a terminated relationship... usually soon after realizing that I'm not as easy of a push-over as I appear to be at first. I compromise readily on unimportant matters. That doesn't mean that I wouldn't fight to the death over other matters. Not all matters are created equal, and I think the real me only comes out in those rare cases of when what matters to me is in actual conflict with what matters to those immediately around me in my life. Crack! There goes another relationship. Sighhhh. 
Hopefully, this will not be too terribly upsetting to Adi, realising her dad has a backbone, and will occasionally not let her get her way, but it takes/took her behaving meanly. -------------------------------------------------------------------------------- ## Fri May 27 09:38:29 EDT 2016 ### Levinux Is a Hack Even if you don't think you need to, you should always get in the habit of git pulling first, when you sit down to a new work session. Even if you didn't commit locally, this serves as just a check of where you are in the current directory and repo you're cd'd into, and the repository, usually on Github or your corporate GitLab (Github clone). Well, my Kubuntu VM on my desktop reports the status "Aborted" through the VirtualBox Manager. I didn't abort it, yet that happened overnight, and I don't even see any Windows notifications of a reboot. It could have been that Macaffe one that I noticed yesterday (and screenshot) stating that it needed to reboot my system. Of course, that must have been it. But yet another Murphy's Law... I have to get all my tmux sessions and virtualenv and psql login-adjustments and sessions running again. Sheesh! I'm finally listening to Steve Levy's Hackers on Audible while I code. How meta. I'm trying to develop better problem-solving instincts. I need to think back to where I left off yesterday. I can only imagine what these early-day code bummers would think of my unrestrained verbosity. It only goes to show that it takes all kinds, and the kinds it takes varies according to the times, and the conditions and state of affairs of the time. I'm learning about the history of the Spacewar program... wow, how much I really don't know about computer history. Just as there is computer science, there is computer history. Paper tapes were distributed to distribute Spacewar. And the DEC engineers used Spacewar as a final diagnostic before shipping PDP-1's out the door, and the sales and installation people knew how to fire it back up when the machine got installed at the customer site. There was an original circle of hackers at MIT that started in a model railroad club, and evolved into a few different circles of people, from the first AI-lab to the first video games and such. There were both the students, instructors, and at least one who was just a Cambridge Urchin (a local/non-student) who hung out at MIT. There are different "schools of thought" just being developed here. I'm running the reports now but I'm afraid it's going to take a very long time this morning, as I blanked the titles database. Well at least, it's automated to the length now where there's no post-processing. I should have really thought to run the entire reports last night before I left... sheesh, how stupid! Okay, just bite the bullet and wait it out. You at least get to see everything run. Okay, I'm continuing to write my book going to and from work on the Subway, and I copy-and-paste my notes from SimpleNote. Here's this mornings: # Levinux is a Hack > If hackers are born, then they're going to get made. > And if they're made into it, then they were born. -- Ricky Greenblatt Yes, Levinux is a hack. But hacks like this are great! You too should learn to hack, so you can make machines do whatever you want. It might start with this black box that mysteriously captures your pointer, but it could very well lead to robot armies, lifetime passions, or guaranteed earning power for life. Two paths in the woods diverged, and taking the one less traveled made all the difference, blah blah blah. 
I'm here to help get you over the hump of getting started with a certain old school way of interacting with computers that still underlies nearly all computers today, and likely into the future. I used to say that Microsoft was the last popular general computer platform left on earth that didn't give in to the *nix groundswell, but I can't anymore, as they prepare to add a popular Linux BASH shell to the system. There's a lot of insider technical jargon in there. It all will be explained. Bottom line: the unification of the generic tech plumbing of the Information Age is occurring -- both at the lower system-levels with Unix and Linux-like operating systems, AND at the higher user interface (UI) through Web browsers with the HTML5/CSS3 standards. With this unification comes greater accessibility of all of it to the masses. Yet, somehow it still remains hidden -- in my opinion, merely through the lack of a proper introduction. And so that's where Levinux comes into play; to make the *nix world more visible and tangible and explore-able and exciting, to make you want to hack on it just enough to have the bravery and confidence to hack on other things. Part of instilling bravery to experiment is to prove to you that you can do no harm. So, I kick off the Levinux Education with the notorious command that recursively forces the removal of all files from your entire system -- a command that old hackers joke about and print on T-shirts. Let the new generation of hackers experience this thing for real on Levinux and know what it would do if they tried it on their Macs (and Windows?) machines. Spark curiosity with the promise of ancient hidden treasures. Not only do I WANT to provide the Levinux Education to do this right now in my life for the sake of thinking through the technical aspects of my daughter's homeschooling education, but also by mere virtue of seeing it all so viable and within my ability to create and (initially) support right now. Frankly, I think it's necessary at this time in human history. The world needs unified controlled conditions -- a sandboxed cleanroom to play around with the nearly identical underlying parts that all tech is built on these days -- without any fear of breaking things. You can start by hacking Levinux. In doing so, you're taking a journey both into the past and the future. And I think I will get you addicted, and perchance put you on the road to pioneering your own new thing. To get started, you don't need to get anyone's permission. You don't have to worry about breaking anything. You can try over and over quickly. You have everything to gain, so what are you waiting for?

# On Use of Tools

People like to be able to trust their tools and weapons. And once you learn a few, you don't want to necessarily have to re-learn everything every few years, or even months. Well, that's exactly what you have to do in tech. Can you imagine a master pianist having to deal with the constant moving of the keys and exactly what happens when you hit them? Well, that's what we do in tech. One might assume that this is an unavoidable state in the field of technology, and that tools in tech can't reach as stable and "perfect" a state as a violin or hammer. Truth is, they can, they have, and they do all the time. We're just in the early stages of violins and hammers in the world of Information Tech that connects everything together in the Information Age. Some people do actually think differently, occasionally. If you're here, you're probably one of them.
Popular notions drift generationally as new and slightly altered ideas develop, shaped by new experiences the prior generation was never exposed to. But hacked it together, I did. And I'm doing it just in time, too -- while running like this on the desktop is still within context for a great number of people, who still have the desire and curiosity to learn something. So quick, pay attention and listen to what I have to teach, while this trick is still doable. Really thinking about what's going on here shows what a remarkable state we are in as clever monkey inventors. We can make computers that spread like a meme inside networks of other computers. I am doing this in the hope that you will recognize something here that appeals to you -- like an artist finally finding their natural media, as I did... the second time, after the first being the Amiga computer. Tools do make a difference, and tools can let you down, setting you back in life, mostly thanks to vendors and the profit motive. The Amiga was my first true love. I was not alone, as many Amiga freaks like me can attest to. Many of us let go of our Amigas kicking and screaming in our transitions to Macs and PCs -- the enemy. So all us Amiga users who re-platformed onto mainstream OSes are traitors. Everything that has come subsequently has been something of a disappointment. I've been looking for something as love-worthy ever since, with very limited success. Until now. I set new criteria, top-most of which was guarding my heart, so it would not be broken again by companies going under. And surely Microsoft is not going under, so I'll take up Active Server Pages and VBScript... oops, shame on me. My criteria refine. Is it Linux I'm looking for? Kick Mandrake's tires... whoah, cool, but still way too techie for my liking with all the C-compiling and dependency resolution. I have never had much luck with C or its derivatives. I often characterize them as -- and this includes Java -- defeating me. Try Linux again... and even FreeBSD for BSD. Time? Almost. Oh, the Debian repository is cool. And this Github thing... nice. Certain technologies are too old or too everywhere to be called mere fads. Unix and Python are in that category, as are the vi-based text editors, and even now distributed revision control systems, with git chief among them. So, you have to remain flexible ENOUGH. Don't be a Luddite. Do be flexible, but not TOO flexible as to be taken too deeply in by fads. Check out fads, because they could lead to trends which lead to timeless tools (as did git), but most fads DO NOT go that route. The later chapters, even on such things as Ruby, have not yet been written. Ruby is enjoying a boost because of one inspired framework that became very popular, and Apple's decision to distribute Ruby natively with the Mac, and the Homebrew people jumping on the Ruby platform for their brew FOSS software repo for the Mac, designed to do much of what the Debian/RedHat (apt-get/yum) systems do for Linux. And so Ruby looks a lot like a fad being upgraded to trend, but it is too early to know. Python, on the other hand...

--------------------------------------------------------------------------------

## Fri May 27 10:20:49 EDT 2016

### Mutually Exclusive Networking Issue

Okay, the reports are running, and the titles are being fetched. It's already 10:20 AM, but when they finally are run, they should be mostly without problems. At least I can use the Foreign Data Wrappers of PostgreSQL to TEST my CSV files.
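To be concrete about what that testing amounts to: force a full scan of every foreign table and see which ones blow up. Here's a minimal sketch of the idea in Python with psycopg2 -- the connection settings and table names below are made-up placeholders, not my real ones:

    # Row-count smoke test over PostgreSQL Foreign Data Wrapper tables.
    # Connection values and table names are hypothetical placeholders.
    import psycopg2

    TABLES = ["keywords_site1", "urls_site1"]  # stand-ins for my real FDW tables

    conn = psycopg2.connect(dbname="reports", user="postgres", host="localhost")
    for table in TABLES:
        try:
            with conn.cursor() as cur:
                # sum(1) forces a scan of every row, so any bad line in the
                # CSV behind the foreign table surfaces as an error here.
                cur.execute("SELECT sum(1) FROM {}".format(table))
                print(table, "ok, rows:", cur.fetchone()[0])
        except psycopg2.Error as err:
            conn.rollback()  # clear the failed transaction before the next table
            print(table, "FAILED:", err)
    conn.close()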
It's not precisely the same as pulling them into Tableau, but if there are problems in the CSV files, that will shake them out by my sum(1) select statement trick, which counts every row in the tables. That was a good move. It will help me proactively catch problems. The Hands-on Imperative. Okay, finish out yesterday's key point: get hitting Amazon Redshift with my login credentials from a machine not on the internal network.

Fri May 27 13:09:43 EDT 2016 Okay, I am definitely in an A or B situation. Having difficulty getting A and B. Have to let Marat know. We leave early today, and the Triweekly is unlikely to occur in its scheduled time. Okay, I let him know by email and got a meeting on the calendar for next week with Branden.

--------------------------------------------------------------------------------

## Thu May 26 12:43:13 EDT 2016

### Battling Murphy's Law

Okay, a bug just struck in the reports, but it only hit one of the tables... odd. One table for one property.

Thu May 26 15:51:46 EDT 2016 Struck by meetings and discussions. Still must not forget to give one particular person an update. Hmmm.

- Email to Report-audience Stakeholders re: holiday / coordinate with Marat
- Email to Jamie to follow-up from this morning... investigate!

Okay, emails sent. Files updated. Stress endured. Think. Let stuff bubble up. What's important? The important stuff'll come back around. What's the important stuff? Oh yeah! Is Amazon Redshift reachable with my login credentials while NOT on the internal network? I have a barebones file that will let me test. I need to externalize the credentials, though... so that's my next step. Ugh! psycopg2 on the Mac... again. Anyway, I can test this from home now, and so I shall. Later.

--------------------------------------------------------------------------------

## Thu May 26 11:19:00 EDT 2016

### TWO roads diverged once again...

Okay, on the 11th floor, hanging out at the couches. This is going to be interesting. Make myself a not-unexpected presence here, which is a bit interesting, because I am so much like the odd-man-out. I am a developer, but not a developer. I type fast, and understand their issues deeply, to the implementation level. I know an application-level cache from a Varnish cache from an Akamai network boundary cache, and the issues of dynamic versus truly static cache-able resources. And I keep my personal daily work journal in vim. I advocate the full-screen nix-like terminal interface. Ha ha ha, I just heard "It must be Akamai" mentioned in background conversation. Just sit and absorb for a little while. Don't look like you're (I'm) projecting my issues as the only important and currently hot ones. First, become one with them. Know what is hot for them right now. Two paths... you know what? I need that poem here in this journal, I refer back to it so much. Over time, all my favorite re-publishable poetry, and my own poetry, should appear here.

TWO roads diverged in a yellow wood,
And sorry I could not travel both
And be one traveler, long I stood
And looked down one as far as I could
To where it bent in the undergrowth;

Then took the other, as just as fair,
And having perhaps the better claim,
Because it was grassy and wanted wear;
Though as for that the passing there
Had worn them really about the same,

And both that morning equally lay
In leaves no step had trodden black.
Oh, I kept the first for another day!
Yet knowing how way leads on to way,
I doubted if I should ever come back.
I shall be telling this with a sigh
Somewhere ages and ages hence:
Two roads diverged in a wood, and I—
I took the one less traveled by,
And that has made all the difference.

Yep. Okay, so think about issues of setting up a cloud server of your own for PostgreSQL, and doing your WHOLE SCRIPT there. Why not? Well, the first question is whether my Amazon Redshift credentials can get me logged-in OFF of the ZD internal network. Again, it's a question of having done my research first, so that I can even articulate my question correctly. Am I really in an ((A but not B) or (B but not A)) situation? Go check my Amazon connection string values and just see if I can't infer it from there (no, of course I'm not going to post them here -- you'd need a SARs form filled-in properly to see them, and we'd give you your own. Accountability, and all that). Okay, ah ha! I could just turn to the side to ask a few questions, but I will resist. I want to answer a few things for myself. I need to understand the network "shape of things" without having to be explicitly told, and that's a reasonable capability for me to have. It will serve me well long-term. Interestingly, I misplaced my Kindle Paperwhite at home last night. It will show up, but in the meanwhile, I feel a bit cut off from my books. Even though I have access to them on my phone, the screen is too small (and broken), and projects the wrong image, so I'm installing the not-highly-rated Kindle app from the Mac App Store. After my abysmal experience with the Kindle app from the Windows 10 Microsoft Store, this will be interesting. It supports full-screen... that's good. Oh, it will only show my actual Kindle purchases, and not all my "documents", which are my O'Reilly books extracted from the $5@ iPhone-priced versions that I purchased and uploaded to Amazon. This is all besides the point. What are the usual suspects of... of what? Tracert, of course. This is just an nslookup task. I don't need a DNS zone dump or anything. This is just an nslookup against the URL address I use to connect to Redshift... BAM! Public IP. Okay, that's a really good start. I don't think I'm even going to need VPN to connect to Redshift from the outside. This can be a completely standard tcp/ip communication task over the generic Internet, no private network access required. I have to do a test to get a success-assured moment. 1, 2, 3... Oh, didn't I have a test file floating around for bare-minimum Redshift connectivity testing? Yes, yes I did. It's text.py on my local virtual machine... ugh! Okay, gotta go make that part of the main repo (which is a private github repo AND doesn't have the login credentials embedded into it)... and so go make that part of the repo, then come back to the 11th floor.

--------------------------------------------------------------------------------

## Thu May 26 11:08:54 EDT 2016

### Solving a Mutual Exclusivity Networking Problem

Okay, I got that very important documenting of the beginning of the next phase of my life out of the way, and now for a really good work-product of a day. This is going to be cool. I'm poking around my virtual server options a bit, but I have to chat with a sterling gentleman before I commit down a path. I can have A but not B in one scenario, and B but not A in another scenario. I want to see if I can have A and B in one. For my discussion with him, make sure I have my terminology down regarding this type of networking Bonjour made popular with the .local at the end. It's... zeroconf, or Zero Configuration.
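Speaking of the network "shape of things": the quickest sanity check is just resolving the names involved and seeing what comes back -- that's how the Redshift endpoint showed me a public IP above. A tiny sketch of the same in Python; both hostnames below are hypothetical stand-ins, not my real endpoints:

    # Quick name-resolution check; hostnames here are hypothetical stand-ins.
    import socket

    HOSTS = [
        "example-cluster.redshift.amazonaws.com",  # stand-in Redshift endpoint
        "somemac.local",                           # stand-in zeroconf/Bonjour name
    ]

    for host in HOSTS:
        try:
            # A public IP for the Redshift endpoint means plain tcp/ip from
            # outside should work -- no VPN or private network required.
            print(host, "->", socket.gethostbyname(host))
        except socket.gaierror as err:
            # .local names only resolve if the OS resolver speaks mDNS
            # (Bonjour on the Mac, Avahi on Linux).
            print(host, "did not resolve:", err)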
The concept here is that the Apple Bonjour tricks of machine and service discovery have pretty much been embraced by the rest of the world to give everything a .local address. Okay, you are equipped. Now go talk to the Tech team.

--------------------------------------------------------------------------------

## Thu May 26 10:15:56 EDT 2016

### Witness The Nested Miracles / Re-Defining Legacy Computing

That you are reading this is a remarkable, and highly unlikely, event. I'm basically a marketing drone nobody (in Commodore engineering terms), the young pup upstart hopelessly naive enough to think that I can still make a difference. This Levinux is a nifty little project to shove in their face that I can. Half-baked hacker solutions are sometimes the best. When mediocrity marches in to fill the ranks of clock-punching wage earners who demand to get in on the gravy train that you just pioneered, it's time to jump ship and lead a new project that's maybe 5 years ahead of its time, that no one will get as it plays out in real-time, looking and feeling a lot like performance art. But years later, some will look back and go Ohhhh, that's what he was getting at with all that qemu + tinycore = minimal VM, and maybe a neuron. You'll say he saw it and was trying to demonstrate it by cobbling together the almost comical bits in today's information landscape. I don't just want to bring you a book. I want to bring you both a linear and a nonlinear experience. I want to create and control the media that it plays back on. I want to use simple, generic components and techniques that others themselves can easily figure out by looking at, and use them for themselves in whatever way they wish. I plan to not interrupt my daily work flow by focusing on this project, but rather to turn my daily work flow INTO this project. I am, as one of my old life-instructors, Jeff Porter, used to put it, solving simultaneous equations. How does one enjoy what they do, become a hot commodity in the job marketplace so that you can always earn, have a family and raise a kid, have the satisfaction of being amongst the best in the world at what you do, and still have plenty of time to really be with and help and enjoy your kid growing up, at the ages when it matters most? I couldn't. But I won't complain. I'll make the best of it, applying the same pragmatism that drew me to Linux and Python. Give Adi 70% of what she needs from her interactions with Dad with the 30% of the week I have with her, and make up the rest with the SciFi telepresence projects you hear about through movies like Interstellar and books like Diamond Age. Adi, I am your ghost, or I am the Jackal King, because why the F-not? Making my communication with my daughter Adi my secondary purpose of this How to Be Technical Book of Levinux Tiny Virtual Server project is ideal. I am in a happy place in my head as this project gets underway, with thoughts of Grace Hopper and Ada Byron Lovelace. That will add purpose to the order of the projects I undertake in this book; telepresence tech at the top of the list. Maybe I can turn it into a sort of head-start gift in the world, ensuring she has a sort of literacy that far exceeds the average person of her day. That means really making Python like a native spoken language for her. I need to really let her start seeing the value of Pythonically manipulating lists ASAP. Or maybe just Scratch. My future decisions in this journal will tell. We are as early in the true computer revolution as probably Charles Babbage.
Or maybe something like the PDP-1 from Digital Equipment Corporation (DEC), and one of the first moves off of vacuum tubes and onto transistors, making computers accessible beyond the elite priesthood of mainframe operators on million-dollar equipment. Now, here you are looking at my code running on... well, take a deep breath. Let's see. You're reading this in a web browser; for simplicity's sake, we'll trace our stack up from the little black server window thingie that I call Levinux that's serving this Web page to your browser. Okay? This is possible for a variety of unlikely reasons. The first of these little technology miracles is the QEMU free and open source PC emulator, which is the actual software you see running on the desktop as the little black box, that will probably capture your pointer if you click it. But we get to enjoy the miracle of highly compatible, time-tested qemu binaries that have been distributed over the years to great fanfare, by virtue of the fact that they remain hosted, linked-to and not de-linked for one of the thousand reasons that could have gotten them de-linked. This is mostly the case with the Mac binary, based on the popular Q fork of qemu, and the Windows binary, which I'm pretty sure is one of Stefan Weil's at https://qemu.weilnetz.de/, but I'll have to check which. Interestingly, the Linux one is just pulled from the Ubuntu repo, made-to-order the way I'm trying to go with the other versions, with pointer-capture turned off. But there's a lot of learning and work between where we are today and me being able to compile my own QEMU binaries with the optimized desirable traits for Levinux across all host platforms (variations on Windows, Mac OS X and GNOME/KDE/Unity). I get most Linux desktops for free, because I successfully get it to run just in the native command-line terminal. So, old time-tested qemu binaries for each platform are part of a strange cocktail of files wrapped up, disguised as a Macintosh App on the desktop of a Mac. On Linux and Windows, this thing that looks like an App on a Mac looks like a folder named Levinux.app on a Windows or Linux desktop. But the entire magic of Levinux is all mixed into the location:

    levinux/Levinux.app/Contents/MacOS

...relative to the directory you created with git when you cloned https://github.com/miklevin/levinux

In that folder, you will find such curious goodies as qemu.exe -- the binary that's indirectly invoked when you double-click the WindowsLevinux.vbs icon at:

    levinux/WindowsLevinux.vbs

Windows had an interesting obstacle to overcome, in how everything invoked from the command-line "locks" the command-line open if it is itself intended to have output. So, gone are the days of "breaking off" a user interface from a program invoked from a .bat file, expecting to eliminate the original Windows Command console, CMD, or whatever the heck they call it -- for many security reasons that have to do with preventing .bat files from running software silently in the background. For that, you have to trick someone into clicking a .vbs file, which actually still does have (and hopefully will keep) the ability to spawn an invisible and unattached child task -- namely, qemu32.bat located in that MacOS location I referred to above. qemu32.bat has inside of it the magical invocation that does much of the magical dot-connecting of the Levinux project -- a blending of the strange and odd boot codes and execution context options of both QEMU and Tiny Core Linux.
Each host operating system has its equivalent of qemu32.bat, which are as follows:

| Path to boot script file to execute to run Levinux   | Host OS  |
| ----------------------------------------------------- | -------- |
| levinux/Levinux.app/Contents/MacOS/qemu32.bat          | Windows  |
| levinux/Levinux.app/Contents/Resources/qemuonmac.sh    | Mac OS X |
| levinux/LinuxLevinux.sh                                | Linux    |

At this point, one might think that maybe Levinux is vanity-ware, and a tribute to myself... and in the voice of Phineas of Phineas and Ferb, "Yes... yes, it is." However, you will also notice that LinuxLevinux pretty much contains my no-longer-with-us mother's name embedded into it (Lynne Levin), so this is also something of a tribute to her. I'll have to work my dad's name, Chick, into it. Then maybe I can use some of their old friends' favorite joke, Chick 'N Lynne. And I'll call it Legacy software, appropriating an old term that will eventually go out of date, to be replaced by a new concept of Legacy in computing... the trans-generational system stuff, which is one of the brass rings I'll be grabbing for with the Levinux project (among many others).

--------------------------------------------------------------------------------

## Thu May 26 09:31:09 EDT 2016

### Going to Teach Adi Algebra and Calculating Percentages ASAP

Re-initiating a terminal session in a full-screen window for my latest journal is just now a morning ritual on whatever computer I sit down at. This relies upon the fact that I did a git commit and push on the last journal entry I made from wherever I was. If I didn't, then the new work I do today will get out ahead of some work that exists somewhere else on one of my various laptops, desktops, virtual machines, or whatever out there somewhere, which I will likely encounter again someday and realize the mistake. This is why we git pull every morning before we begin to work. Git pulling, even before committing your local work, is fine, and even desirable if all you're working on is a personal journal. That's because the code doesn't have to keep running, and the continuity isn't all that critical -- or at least is only as critical as your subjective judgement. And so, before you do a git pull, you can just copy your current version elsewhere temporarily, then do a full git pull, forced, let the auto-merge occur, and if it didn't go right, you can always go back to the original, which you copied outside the git system. I find such a workflow easier than mastering git's many powerful options, which I'm totally sure could have gracefully dealt with this scenario. That's one of the personal deficiencies I will address in this project. Another personal deficiency I will be addressing is not giving my daughter enough of my time in this post-separation arrangement, where she's spending most of her time elsewhere. I have her for weekends, and so I'm going to have to one-better the 80/20-rule... plus. My daughter deserves better than 80/20. I have her for 2 out of 7 days a week (baseline), so that's 2/7, or close to x/100th? Solve for x. Well, it's tricky, because 7 doesn't go into 100 cleanly. But we do know that seven times ten is seventy (7 x 10 = 70). So, it's something more than 10 times. This will be important, because the number we get is what we will multiply two by (2 x that number) to get our percentage of the week that the weekend is. But how much more than 10 times? This is where the quick head-math gets tricky, and it's valuable to have a visual framework for the work.
In showing you how to calculate this, I will also be venturing into some table-construction in markdown. If you view this not as the JavaScript-altered HTML but as just the "view-source" markdown in which I wrote it, you will see table structure at about this point. Okay, first a reminder from: https://github.com/adam-p/markdown-here/wiki/Markdown-Cheatsheet#tables

Colons can be used to align columns.

| Tables        | Are           | Cool  |
| ------------- |:-------------:| -----:|
| col 3 is      | right-aligned | $1600 |
| col 2 is      | centered      | $12   |
| zebra stripes | are neat      | $1    |

There must be at least 3 dashes separating each header cell. The outer pipes (|) are optional, and you don't need to make the raw Markdown line up prettily. You can also use inline Markdown.

Markdown | Less | Pretty
--- | --- | ---
*Still* | `renders` | **nicely**
1 | 2 | 3

| Original | Percentage  |
| -------- | ----------- |
| 2 days   | ?           |
| 7 days   | 100 percent |

So, to jump right to the answer, it's (2 * 100) / 7. So, the first thing to realize is that the asterisk (*) is the same as using a little x for multiplied-by in normal algebraic math. The slash (/) is the same as using the divided-by symbol (÷) in algebra. The parentheses group order-of-operations, and so the calculation is really also: 200 / 7 = ? So, how many times does 7 go into 200? Ten? No, we know that 7 times 10 is seventy, because we just add a zero. And 7 times 20 we know must be 140, because 7 x 2 is 14. Or 7 + 7 (which is the same as seven times two) is fourteen. So we know that 7 must go into 200 more than twenty (20) times. But how much more? If it were 30 times, then we'd add another 70 to 140. That's the same as 7 x 3 x 10. Seven times three is a tricky one you have to count up to or memorize, because 7 is a high-ish prime number, and remembering its intervals is tricky. Just add 7 to 14 for a shortcut. Count up... +1 = 15, +2 = 16, +3 = 17, +4 = 18, +5 = 19, +6 = 20, +7 = 21. So, 7 x 3 is 21. Add a zero to 21 (for x 10) and you've got 210. Well, 210 is greater than 200, but not by much. So we know that 7 goes into 200 just under thirty (30) times. And so I have Adi for just under 30% of the week. Let's say 28%. If you cheat, and plug 2 x 100 ÷ 7 into a calculator (re-arranged to abide by order-of-operations for the calculator), we get 28.5714285714... and the decimal remainder probably goes on for quite a while. The odd nature of the result reaffirms for us that getting a ballpark estimation was a bit harder than it probably should have been. This strange fact comes down to the fact that we were born with 10 fingers, 7 is a prime number, and our week happens to be 7 days long. This coming together of a series of parameters to calculate a number that humans are familiar with as an API (understanding quantities on a scale of zero-to-one-hundred) is a good exercise to understand the real-world utility of doing math quickly in your head. Some things come down to process automation. But others come down to applying abstract formulas that you know work, because it has been proven time and again to be so. And so now we know that on a scale of one-to-a-hundred, I spend about 28.6% (twenty-eight point six percent) of the week with you, somewhat extended, because I usually see you on Friday night, and usually cut short because I drop you off on Sunday night. The grandparents are offering a way to see you Sunday morning as well, and that is indeed interesting. I need to work my telepresence projects into all of this now.
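And because the whole point is making this stuff feel native, here is the same arithmetic as a few lines of Python 3 -- just a sketch of the head-math above, nothing more:

    # What fraction of the week is a 2-day weekend?
    days_with_adi = 2
    days_in_week = 7

    fraction = days_with_adi / days_in_week       # 0.2857142857142857
    percent = days_with_adi * 100 / days_in_week  # same thing on a 0-100 scale
    print(round(percent, 1), "percent")           # prints: 28.6 percent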
This is the raw initial draft of an important chapter in my book, teaching Adi Algebra, and calculating percentages. Commit and push this entry, and get your ideas from the morning subway ride in here too! Important stuff. Also want to think about the addition, subtraction, multiplication and division education that comes well before algebra. But these abstract notions are going to be a breeze for Adi's impressively developing mind. I have to make sure to turn her onto this stuff, and not off. Wanna/need to/gotta make sure she has those very satisfying "ah-ha!" moments, and internalizes the occasional pleasure of math.

--------------------------------------------------------------------------------

## Wed May 25 22:25:34 EDT 2016

### Next: Don't Capture The Pointer? And Adi.

Okay, it's almost 10:30 PM. Billy's in his extremely cuddly mood. Not easy to ignore him and type. Is that what I am doing with my family too? Ignoring them, so I can type? Why aren't I organizing? Why aren't I using this time at night to do all those things I need to do? Why is tomorrow Thursday already, and I've only done half the things I intended to do this week? And why do I still feel so good? Yeah, it is because I am expressing myself, and I will work all that other stuff out. It may be a wee bit more painful to resolve everything than it would have been were I on top of everything. But then, were I on top of everything, I couldn't be correcting my career trajectory as well as I have been doing, lately. A culmination of everything I've been working on besides Adi, which is everything that I am besides Adi. It is a great deal of what I will have to pass on to Adi... who I am, what I'm about, and what I'm capable of, and how I saw something coming, and I jumped on it. How I was the first one to pull off this particular viral memetic information age harmless (it's certainly not damaging to anyone, though I may tease) artistic hack. I'm just wiring together a few cool but common pieces, and throwing in a few custom scripts, and performing a little SEO. I want Adi to see that I saw something, and I did something about it, and I gave it a really good go. Damn, do I need to learn how to compile minimal -- maybe even old versions of -- qemu, baking in static dependencies, and getting it to somehow not capture the pointer. This is mental conditioning, that's what this is. It's part of what I need to keep myself anchored and grounded. Are those good things? Maybe it's to keep my ship stable and seaworthy in a storm... yeah... something like that, for sure. Now, let's flash some signal-lights between ships. Oh, and to paste my thoughts from the subway ride home:

# Alice and Bob Open Shop

Hello y'all Levinux tire kickers. Welcome to where you can "sudo rm -rf /" with confidence and certain smugness over your friends who wear that T-shirt but are virgins to the experience. Let's pop your Information Age cherry by actually performing that absolute, unrecoverable self-destruction wreck of your system. What you do is open Terminal on your OS X Mac and... oh, you're on a PC? Well, there's PuTTY... still too complicated? Sheesh, okay, you hit 3 to log in to Console. First, you've got to click with your mouse actually on the Levinux window to send the 3 and Enter keystrokes to it. It's just like a web browser or other Windows or Mac program in that regard, and so it should not be an insurmountable obstacle.
Consider this step the intelligence-test portion of the introduction to text-and-terminal-based Unix, Linux and other *nix-like operating systems that are the new plumbing of the information age, with a free and open source twist. Okay, now type the following on the line that... Not only do I believe that the Information Age is already behind us, but it is already stratifying wealth and control of the world's still-finite resources. Tiny Core Linux is made to boot fast, get network services and do some cool stuff with a repository system much like Ubuntu's, but cooler, smaller, and most definitely faster. No GUI, please. Desktops are fun little safety nets. Real Age of Communication Samurai carry some cool little lightweight doohicky gadgets, along with our bloated intellisensing wipe-your-IDE-for-you ol' favorite, Eclipse. No, I kid you -- Microsoft holds a serious stake in this game, and is no more to be written off than you would Apple or Google. Microsoft is adding the genuine Ubuntu Bash shell, and presumably a sizable Microsoft-Ubuntu-Debian repo. Who'd have thought the non-POSIX-compliant one would be the one to win the standards war, because the Debian packaging system resolves dependencies something fierce -- presumably over Redhat's rpm/yum system. And so we begin our technical education by exploring the nuances of the schisms that exist in tech. Alice is an Admin. The A represents her first initial, place in alphabetical order, and her Authority. Alice is Always on. Anytime, day or night, Alice will take Action that she is Authenticated and Authorized to perform. Alice has Answers. Then there's Bob. Bob is a Businessman who doesn't want to be Bothered. Bob's business is basic. He doesn't even want to be online, but Bob has questions he needs answered to better run his business. So when Bob needs Answers, where do you think he goes? Alice represents not just a coder, programmer or even hacker. Alice knows how to play the communication game. She's casting off memes into the communicationscape to go self-guide their ways to the Bobs of the world -- mostly to advertise her services.

--------------------------------------------------------------------------------

## Wed May 25 21:46:31 EDT 2016

### Merge and Rant

Okay, going to do that git merge. Attempting a git pull before even doing a commit of my local changes to see what happens. Results, and subsequent commit and merge, shown here:

    Medium-iMac:miklevin.github.io miklevin$ git pull
    remote: Counting objects: 21, done.
    remote: Compressing objects: 100% (19/19), done.
    remote: Total 21 (delta 14), reused 9 (delta 2), pack-reused 0
    Unpacking objects: 100% (21/21), done.
    From github.com:miklevin/miklevin.github.io
       1e0faac..92a4545  master     -> origin/master
    Updating 1e0faac..92a4545
    error: Your local changes to the following files would be overwritten by merge:
        index.html
    Please, commit your changes or stash them before you can merge.
    Aborting
    Medium-iMac:miklevin.github.io miklevin$ git commit -am "Attempting merge"
    [master a8175a8] Attempting merge
     1 file changed, 20 insertions(+)
    Medium-iMac:miklevin.github.io miklevin$ git pull
    Auto-merging index.html
    CONFLICT (content): Merge conflict in index.html
    Automatic merge failed; fix conflicts and then commit the result.
    Medium-iMac:miklevin.github.io miklevin$

Okay, git is really awesome when you're working by yourself. Okay, so I get to learn how to read this lovely merge conflict syntax.
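For next time, here is the copy-it-aside-then-pull routine I described a few entries back, roughly sketched in Python. The repo path is a hypothetical placeholder, and this is the lazy-but-safe journal workflow, not proper git technique:

    # Keep a copy of the journal outside git, then pull; if the merge goes
    # badly, nothing is lost. The repo path below is a hypothetical placeholder.
    import shutil
    import subprocess

    REPO = "/home/mike/miklevin.github.io"   # local clone (placeholder path)
    JOURNAL = REPO + "/index.html"           # the one big journal file
    BACKUP = "/tmp/index.html.pre-pull"      # safety copy outside the repo

    shutil.copy2(JOURNAL, BACKUP)            # snapshot before touching git

    result = subprocess.run(["git", "pull"], cwd=REPO)
    if result.returncode != 0:
        # The pull (or its auto-merge) failed; hand-merge at leisure, knowing
        # the pre-pull version of the journal is still sitting in /tmp.
        print("git pull failed -- backup kept at", BACKUP)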
The conflict markers clearly wreak havoc on my vim markdown color coding, although you could not possibly appreciate that reading this wherever you are. Hmmm, how do I show you these codes? Oh, merely by indenting them more than 4 spaces of course!

    <<<<<<< HEAD

This was inserted just above the datestamp in the post chronologically before this one, which I recognize as last night's writing regarding Levinux language, on the Learn More selection from the rc.local menu. Okay, what do I really want this creation to be? This is the audio-visual aid -- the live example -- the CD-ROM that comes with the book, as it were, for the book that I'm writing. However, the only way you are going to experience the book is through this writing, which I publish here. I have a public journal where this has all at one time or another been in the public eye already, but in the very, very, very, very, very fringes. Anyone reading this on mikelevinseo.com in real-time as I write this is absolutely as insane as I am for writing it. We are those who think out loud in vim, and are perfectly comfortable keeping an online blog in precisely one well marked-up textfile. Or should I say marked down? Well, it's mostly markdown, at any rate, with my hard-returns set at 80 columns, so that I train myself to work in 80-column mode. It is optimized for the human eye. It is not tiny. It is just right for reading, on a comfortably appliance-sized monitor. Even just 24 inches is getting to be a little too big. Coding is a personal matter -- an intimate screen, like writing a book, where you have near thought-to-text mind-control over what's on the screen -- okay, perhaps not as much as an emacs user, but still really quite timelessly invaluable. Getting into the zone really does mean staying away from the mouse for menial tasks. The zone, the flow... whatever you call it, I am a big believer in the artistic-inspiration aspect of problem-solving in code. And I don't believe in pulling my punches on the geek-out I'm doing, now that I actually have some sort of novel audience finding me through a novel means. Everything about this project is quite novel, starting with challenging you to recursively delete everything from system root with superuser elevated privileges. It's a pretty good joke in its own right, but the joke gets even funnier when you consider we're joking right back at the people who use that joke, by doing precisely that... on Tiny Core Linux. I think this is where the disconnect comes from between the people sayyyyy... who hang out a lot on the Tiny Core Linux forum, turning people off to it by shooting people down with truly vintage old school draconian sysadmin condescension, and the vast majority of the world who could actually benefit from a little exposure to such a wacky but good-to-learn-from distro, where you can "sudo rm -rf /" without fear. I'm finally doing this. Connecting dots that were obviously there, and just bringing the 3 different projects into one, is just totally natural. I have three things I'm working on. The first is this nifty, effectively jumbled mix of binaries, data files, loopback files, and drivers, varying per host OS (Mac packages/bundles mixed with Windows .exe files and Linux ELF files) that actually work together to pull off this trick.
--------------------------------------------------------------------------------

## Tue May 24 21:49:05 EDT 2016

### Planning a Levinux Learn More screen

If you've reached this far, you are a curious individual, and we have undeniably made some sort of connection. With 7 billion people in this world, you have somehow managed to connect with me right now, over all of them. So, why not connect with me moving forward? You be in my tribe. I'll be in your tribe, whatever. Our tribe has not fully materialized yet, nor might it ever, as the nature of this thing is something to move on from quickly, once I've got you over the nixy hump. Unix is easy. Unix is hard. Both are right, but in either case Unix, and the nix-like OSes under that umbrella, namely Linux, are above all worth it. As of very recently, the free and open source software community has officially won. The rest of this decade is just tallying the booty and the body-count of the transition, so well documented in The Cathedral and the Bazaar. So, let's hook up. facebook.com/miklevin is as good a place as any to start. The world is imperfect.

--------------------------------------------------------------------------------

## Wed May 25 16:58:27 EDT 2016

### Wable Upped Their Prices (And I Got it Back Down)

Whoah, Wable is upping its rates on me, starting June 1st. Time to reconsider my cloud-server provider -- and whether I even want to go with an actual cloud provider, and not a home-spun personal datacenter, on OpenStack, or something. Glad I logged in to check out the rates. They're billing on resource consumption, instead of a fixed price per CPU... and so... let me watch their video about how to resize to save money. If I don't like it, I'll find another way. Maybe I keep one main instance of Pipulate up, but it certainly makes me reconsider my personal hosting strategy. There goes the $1/mo/server dream... for now. I cut it down from 5 servers to only 3 minimal ones, which brought me from a forecasted $30/mo to $13/mo. Plan through next steps. Even of those 3 remaining servers, I have to free at least one up for the coming project. It just makes sense for me to use a Wable one... okay, so I don't need 4 (total) Pipulate servers right now. Keep prod.pipulate.com and newyork.pipulate.com. Get rid of seatle and dallas. Remove those entries from the Rackspace loadbalancer... don't just disable them... delete them... servers and all... okay, done. One very small Wable server:

- Cost: ~$4.32/mo
- CPU: 1
- RAM: 1GB
- Disk: 10GB

Okay, now I'm spending less than I was. That 5 CPUs for $8/mo was too good to last... at this moment in time. It will come back, but in the meanwhile, I am going to focus on just keeping 2 servers: one for personal, and one for work. Okay, it's 5:30 PM, and my brain is quite frazzled from today. The second running of the data files is occurring right now... and it's done and tested. I expanded the table import check to show WHICH sites passed/failed the import. All in my private core repo on github. That's enough for today. Maybe try to get my Levinux symlink in before I head out today.

--------------------------------------------------------------------------------

## Wed May 25 12:39:25 EDT 2016

### Still Just Murphy's Law, and NOT a Rabbit Hole

I really enjoy talking with my co-workers about Python and my projects. But now is the time to focus like a laser beam on these SQL errors that keep all the data sources from being cleanly importable into Tableau. Let's make this very systematic.
Figure out how to redirect output of psql to help make detailed checklists. Ah ha! Wow, I know a lot more Unix than I know...

    psql < testtables.sql > debugme.txt 2>&1

How did I learn so friggin' much? Oh yeah, the Amiga, then Levinux. >NIL:, startup-sequence and all that AmigaOS stuff. Wow, was that Unix-like. Okay, you can't afford to wait for a re-run of the entire set of data files. Zero in on one property that portrays all the problems. Okay, these problems are going to be perpetual if I don't take some proactive filtering or neutralizing options. It looks like I'm getting another varchar(1024)-is-too-little-space error, but it's related to the double-quotes again. I think I want to replace ALL vertical pipes in title-tags with upper-case I's, and all double-quotes with... uh... maybe nothing. Maybe just chop out double-quotes. What's the real harm for a report like this? Looks like this would require a re-running of all the scripts... no... yes.

Wed May 25 16:33:35 EDT 2016 I got blindsided by the Tableau server not being able to reach my local machine. I should have figured, but per the Tableau expert here, he's 99% sure internal addresses are not reachable by the Tableau server. I could start pushing for NAT addresses and such, or I can make it part of the transitional plan: getting this thing the frig' off my desktop virtual machine, and onto a cloud server somewhere. THAT would be flexing my muscle, while we put the case in for the tech team to take over the process. But I should elevate myself to a "truly no hurry" stature... because I've got it running on one of my cloud servers I've got sitting around for just such a purpose. Thank you, Wable.

--------------------------------------------------------------------------------

## Wed May 25 09:13:14 EDT 2016

### Installing Tiny Core Linux Extensions Without Their Dependencies

I seem to recall doing some journal entries last night, but I suppose I didn't push. Oh well, I will have some merge conflicts to resolve once I'm home. I should look at the best way out there to resolve merge conflicts. I should just learn to read the messages in the text file better, and resolve the conflicts by hand. It's not so critical in a journal as it is in program code, as it will not affect how anything runs. The scheduled cron job did not run overnight. I am running the script now, and will do a number of maintenance and stability measures today, including testing it under cron hourly and looking at other scheduling mechanisms. I should also give the Tech group a heads-up that this scheduled task will be coming their way. I can feel myself in quite a bit of transition right now. I'm in sort of a platform-limbo. While I'm enjoying Mac OS X's very Amiga-reminiscent handling of virtual screens, I am also very much on Windows 7, Windows 10, Kubuntu and Tiny Core Linux right now. Wow, is that friggin' cool. While the reports are running, I want to look at a few questions I have about Tiny Core. It's time that I start grokking some of its inner workings, particularly surrounding how it maps-in whole directory structures through the tcz extension system. Remember: tce is the folder where .tcz files go. Okay, it's time to just say a bunch of stuff about the way Tiny Core Linux works, along with how Levinux is leveraging those features. It's compiletc.tcz and not build-essential as on Debian and other distros. So, if I do anything that actually requires compiling dependencies (numpy?)
or building extensions for Tiny Core from within Tiny Core, it's compiletc.tcz that I want -- which happens also to be a meta-extension, meaning it probably just installs a bunch of other extensions and their dependencies. Building your own extension seems to be pretty straightforward and elegant in Tiny Core, specifically in the way it maps the contents of an extension into (over-top?) file hierarchy space. In other words, you set up a directory structure (usually in /tmp) and put your files there. This usually mirrors /usr/local/bin and such. This is how the "abuse of" /usr/local/bin for installs is dealt with by Tiny Core. There's a bit of pruning that's required in those locations, such as removal of man pages. A custom version of squashfs (that sets proper defaults) is then used to archive up the directory tree into a .tcz file. There are also optional extension install scripts that live in /usr/local/tce.installed, which do any post-install clean-up work, or make sure any symlinks are created that the extension is going to rely upon. Ownership and permissions on these files are important (root:staff, 755). Okay, so what's a .tcz Tiny Core Linux extension file? Let's break it down:

- It is squashfs, which is a compressed read-only file-format (a simple zip)
- It is loop-mounted, which means it's a file that's mounted/accessible like a drive
- It is something which, even after all that, may need a custom script executed to perform post-install/pre-running cleanup/setup
- This usually means the use of symlinks to the main file system to "put" things into their expected directory tree path locations

There are (at least) two methods for what can be done with the contents of a tcz file. If kept in its "original" loopback location, symlinks are required to make these locations LOOK like native file system directory tree locations. However, why use symlinks when you can just COPY the actual files into those locations for execution-speed? The answer is memory-usage. Keeping extensions in their loopback locations may make the data there slower to access, but it's more gentle on RAM, and STILL has no writeback corruption danger, because tcz files are read-only. I wonder what happens with attempts to write back to the symlinked locations? I guess I'll find out at some point. Okay, paths are really not messed around with in the creation of a .tcz file. All the directory locations inside a .tcz tree are the same exact ones where a "make install DESTDIR=..." would put them. This is a "use the defaults first" approach that keeps things working as intended.
The %PATH environment variable for Tiny Core Linux is:

    /home/tc/.local/bin:/usr/local/sbin:/usr/local/bin:/apps/bin:/usr/sbin:/usr/bin:/sbin:/bin:/mnt/sdc1/tce/ondemand

Okay, and a look in /usr/bin shows me these tc-related commands:

    -rwxr-xr-x    1 root     root      3421 Feb 23 15:43 tc-terminal-server
    -rwxr-xr-x    1 root     root      2316 Feb 23 15:43 tce
    lrwxrwxrwx    1 root     root         3 Feb 23 11:53 tce-ab -> tce
    -rwxr-xr-x    1 root     root      6891 Feb 23 15:43 tce-audit
    -rwxr-xr-x    1 root     root       323 Feb 23 15:43 tce-fetch.sh
    -rwxr-xr-x    1 root     root      8297 Feb 23 15:43 tce-load
    -rwxr-xr-x    1 root     root      1161 Feb 23 15:43 tce-remove
    -rwxrwxr-x    1 tc       staff      620 Feb 23 15:43 tce-run
    -rwxr-xr-x    1 root     root      2347 Feb 23 15:43 tce-setdrive
    -rwxr-xr-x    1 root     root      4616 Feb 23 15:43 tce-setup
    -rwxr-xr-x    1 root     root      2139 Feb 23 15:43 tce-size
    -rwxrwxr-x    1 root     staff     1107 Feb 23 15:43 tce-status
    -rwxrwxr-x    1 root     staff     7511 Feb 23 15:43 tce-update
    -rwxrwxr-x    1 root     staff      464 Feb 23 15:43 tcemirror.sh

Maybe what I want is to NOT go through tce-load. Perhaps tce-fetch is what I want, to avoid dependencies. BAM! I'm correct:

- http://forum.tinycorelinux.net/index.php?topic=6702.0

And doing a fetch without getting extension dependencies:

    tc@box:/mnt/sdc1/tce/ondemand$ tce-fetch.sh python3.tcz
    tc@box:/mnt/sdc1/tce/ondemand$ ls
    python3.tcz
    tc@box:/mnt/sdc1/tce/ondemand$

Woot! Of course, there are the two-fold issues remaining:

- These install into /opt/tce/optional and not /opt/tce/ondemand, like the other extensions installed with tce-load -wi.
- The edit to onboot.lst

Anyway, that link I give has this wonderful example code for dealing with all of that:

    mkdir -p /tmp/transmission
    cd /tmp/transmission
    tce-fetch.sh transmission.tcz
    tce-fetch.sh curl.tcz
    tce-fetch.sh curl.tcz.dep
    tce-fetch.sh openssl-0.9.8.tcz
    tce-fetch.sh libevent.tcz
    echo curl.tcz > transmission.tcz.dep
    echo libevent.tcz >> transmission.tcz.dep
    tce-load -i transmission.tcz
    cp * `cat /opt/.tce_dir`optional/
    cd `cat /opt/.tce_dir`
    echo transmission.tcz >> onboot.lst

Wed May 25 11:35:01 EDT 2016 Okay, Murphy's Law strikes again. Every foreign data source (CSV file) seems to have its own little issues. I'm selecting sum(1) for every table to see what errors I can generate, and there are plenty -- seemingly a third. Time to cut this journal entry, and focus on knocking out every one of those errors.

--------------------------------------------------------------------------------

## Tue May 24 16:51:24 EDT 2016

### First Round of Automating The Reports

Okay, it's time to archive up a directory of files in Python, and move them to a backup location. Maybe I don't even zip them at first, but rather just copy all files from that directory into another directory. Hmmm, let me think. What are the criteria of the project? I want the script to be able to run over-and-over during the day. Maybe I should go in baby-step passes that don't program me into a corner, option-wise. What's the most pragmatic approach here? I can't really afford to mess around with elegance at this time. Get rudimentary functional daily scheduling in place, and then improve the process daily. 1, 2, 3... 1? Remind yourself of the cron daily location. Edit my /etc/crontab to make the /etc/cron.daily contents run at 3:00 AM... done. Next, make a script without an extension that runs my Python script and has the correct execution permissions, and uses full path names to executables and files. Interesting! I have to activate the virtualenv for cron.
This involves using the exact path to the Python executable located in the virtualenv directory... okay, tested. I believe this thing will run hourly. I want to write a simple log file with job-begin and job-end data. Nahhhh... it's friggin' working. I could execute the script manually from cron.hourly. Don't tempt fate. Oh wait, the one thing that does truly need to happen is the shuttling of the files over to the dropbox location, where the files update from. Hmmm. Do that test to see if the data updates live. Yep, confirmed. Updating the CSV file immediately updates the feed. Wow. And finally, put in the code to copy the resulting files from the temp location to the Dropbox shared pulse folder. Do a test-run to see the copy work (but with processing shut off). And now set everything to run beginning-to-end. And go home. Check the dates of the files in Dropbox tonight, and then again tomorrow morning.

--------------------------------------------------------------------------------

## Tue May 24 09:53:00 EDT 2016

### Successfully Connecting to PostgreSQL Foreign Data Wrappers tables from Tableau

Now, I need to think this final stuff through with great precision. I need it all working by the time Marat comes in. Do I install the desktop demo version of Tableau? Probably. But first, look through that solution. Running the reports. Checking my Mac VMs. Both updated to latest. Good. But my Windows 10 VM is nearly at 20GB and every move I make inflates it. Keep that in mind. It's mostly there for Levinux demos.

**Reports**
- Getting Foreign Data Wrappers to list like any other tables for Tableau

**Levinux**
- Controlling Tiny Core Linux excessive dependencies (X-windows) in .dep files
- sudo ln -s /usr/local/lib/python3.4/ /mnt/sdc1/python3.4

These are my priorities for today, with a huge emphasis on the reports. Moving Levinux ahead will be a reward for doing a good job on the reports. Okay, so LOOK AT what the extension is doing that exposes tables in Tableau. If you can't do that, consider "importing" the FDW tables to be native. Just did this to get rid of the "Windows 10" label when swiping between VMware Fusion screens: to disable it, edit your ~/Library/Preferences/VMware Fusion/preferences file and add:

    fusion.ui.fullscreen.nameBadge = "FALSE"

Okay, installing the Tableau Desktop 9.3.1 trial version. Ugh, an additional 922.8 MB crammed onto this machine. Definitely delete it later. Interesting ServerFault discussion here: http://serverfault.com/questions/601140/whats-the-difference-between-sudo-su-postgres-and-sudo-u-postgres It appears that "sudo su" is an anachronism, and the better way to run things as postgres is:

    sudo -u postgres -i

So, you're simply saying "do as user postgres..." The -i is a bit tricky. It's usually associated with sudo itself, but in this context, it probably means acquiring the shell-context (shell/env variables?) of the user, postgres. At least, that's my guess. It switches me over to user postgres but WITHOUT su privileges. I'm sure this is the right way. I have a few questions remaining, concerning running psql "naked" or with the extended context:

    psql

vs...

    psql -U postgres -h localhost

I need to just get a feel for things "inside" of psql more. When running it, there are a few basic help commands it tells you about right away:

    \h for help with SQL commands
    \? for help with psql commands

Using \? shows a whole world of \d "Informational" commands. I have to indoctrinate myself to this stuff, because there's no one indoctrinating me to postgres. And so...
it appears that an upper-case "S" can be appended onto any \d command in order to extend the query to "system tables" as well. It's almost like putting a -a on the ls command to see hidden files. And so \dS is a great "beginning point" for discovery. I see all the objects in the system. There's a pg_catalog "schema" which houses all the system tables. It looks an awful lot like a schema is a database, but I don't think so, because it's not what goes into the connection strings under the database field in order to connect. Okay, I want to get this extension installed:

https://github.com/komamitsu/foreign_table_exposer

...and I can't just trigger off a make/make install, per the C-compiling instructions, but there is another way:

http://pgxnclient.projects.pgfoundry.org/install.h

...which should be installable with a simple:

    pip install pgxnclient

Okay, done. Now, I should be able to:

    sudo pgxnclient install foreign_table_exposer

Wah wah (sad trombone)... bzzzz! Jeopardy wrong-answer buzzer! Okay...

    Makefile:12 /usr/lib/postgresql/9.5/lib/pgxs/src/makefiles/pgxs.mk: No such file or directory

Okay, a little more research yields that I might need some developer tools for Postgres, and so:

    apt-get install postgresql-server-dev-all

And re-trying the pgxnclient install... okay, looked pretty good, but with one warning about ISO C90 forbidding mixed declarations and code... it may still have worked. Per the instructions, I edited postgresql.conf to have:

    shared_preload_libraries = 'foreign_table_exposer'

And success! All is good. Marat can connect with Tableau to my CSV files through Foreign Data Wrappers through Postgres, even though it required the foreign_table_exposer extension, which needed the pgxnclient installer package out of PyPI, which needed the postgresql-server-dev-all package out of the Ubuntu repository... Pshwew! All I have to do now is move files around, set the job on a scheduler, and confirm that the data feeds update live when I swap the CSV files out (without a server restart or otherwise forced refresh). If they don't update automatically, I'll just look for how to make it so. Good stopping point. Now, eat.

Thu Aug 4 17:38:12 EDT 2016 Getting the daily pulse report expanded turned out to be more time-consuming than I thought. Always check what server you're ACTUALLY trying to connect to. Step back and review your basic assumptions... ALWAYS! Also, for the other SERP-tracking job I had to set up today, we're a few credits short. I have the emails out to either down the active jobs (my SEO counterparts) or to up the service plan (my boss). Either way, I'll get the job churning tomorrow. Bummed I didn't realize it until today. And I still have the crawl job to do for the breadcrumb trails. Well, maybe I can get that all churning tonight, simply by looking up the titles against the big list. Get that going before you leave for home (right now), and then at least you can hit that with fresh fury tomorrow AND get the stuff for the enhanced pulse report tomorrow, where I pull back additional data per keyword from Search Console. Hmmmm. Okay. Doable. Okay, the AWR ranking job is running (albeit in slow mode), and the crawl of the pages that I need for the breadcrumb trail project is happening, and hopefully it will be all ready for me to go tomorrow morning. Okay, that job is running beautifully. I can feel this being the next evolution of this SEO Kung Fu product. Wow, this is going to be great. Go home feeling good.
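For the record, back on the Postgres piece of this: the general shape of putting a CSV file behind a foreign table looks something like the following. This is a hedged sketch that assumes the stock file_fdw wrapper; the column list, names and file path are placeholders, not my actual report schema:

    # Sketch: expose a CSV file to Postgres as a foreign table via file_fdw.
    # Names, columns and the file path below are hypothetical placeholders.
    import psycopg2

    DDL = """
    CREATE EXTENSION IF NOT EXISTS file_fdw;
    CREATE SERVER csv_files FOREIGN DATA WRAPPER file_fdw;
    CREATE FOREIGN TABLE keywords_example (
        keyword  text,
        url      text,
        ranking  integer
    ) SERVER csv_files
      OPTIONS (filename '/path/to/keywords_example.csv', format 'csv', header 'true');
    """

    conn = psycopg2.connect(dbname="reports", user="postgres", host="localhost")
    conn.autocommit = True
    with conn.cursor() as cur:
        cur.execute(DDL)
    conn.close()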
-------------------------------------------------------------------------------- ## Mon May 23 13:36:02 EDT 2016 ### Creating Foreign Data Wrapper Definition Tables Time to make the Create Table statements... all that are necessary to make all the data source I need. Hmmm... maybe I should do them right over on the tmux panels to the right. Why not cut down on carpal tunel syndrome, and a few lest left+Ctrl-b's... hmmm, maybe this is why I'm having such a tough time giving emacs a fair try... I dread finger convolutions. 1, 2, 3... 1? Ensure I'm still querying. Hmmm. Okay, things to remember: Restarting the server /etc/init.d/postgresql restart Change current shell user into user postgres: sudo -u postgres -i And then just to run the command-line sql interface: psql From there, you can see all tables with: \d After that, I'm pretty close to executing sql from a file: \i FILE I have only one entry when I do \d. It should be 24. - 8 should be keyword tables - 16 should be URL tables I have already done one keywords table. That was the harder because of how many columns it has. Now, do the URLs tables. And Tableau doesn't list FDW tables from a database connection source... wah wah sad trombone sound effect. Will solve! -------------------------------------------------------------------------------- ## Mon May 23 10:39:39 EDT 2016 ### Supporting Effective Python I have to remember: source venv/bin/activate I think different color-coding is being used for mardkown here than on my Mac. That would make sense. This is a freshly built system. I almost forget whether I customize my vim color-coding (lately) -- particularly for Python. Oops, almost forgot to give by boss the Monday morning update. Go do that pronto! Okay, done. Almost falling in love with my weekly reports. I should really dump them into some sort of unified document -- maybe Google Docs? Just go do that now. Such little investment, such large potential payback. Chatted with boss a bit about supporting other teams with my work. Surely, a nice cleanup of my work is in the not-distant future. I need to get the use of command-line arguments (sys.argv) out of the picture. I think I want to get that book, Effective Python, after having re-listened to that particular episode yesterday on the way home from Staten Islan -- just by chance listening. I "shuffled" such as it were, and re-listened into Michael Kennedy's blogcast right where I started originally: Episode 25 -- a very important one! Okay, just bought the Kindle edition of Effective Python. It was about $18, but I'm very hopeful. Probably a better starting-point than Fluent Python, which I also bought as a result of Michael Kennedy's Talk Python to Me podcast, but is probably too advanced a starting point at this particular juncture. I need to advance from competent beginner to not-ridiculous intermediate. Okay, continue with this morning's work. Make sure you don't miss emails, with this VirtualBox VM creating excessive get-in-the-zone myopia... good idea, already 2 emails waiting. No harm in running Firefox on my second window. Hard to believe dual-monitor mode works so well under VirtualBox. -------------------------------------------------------------------------------- ## Mon May 23 10:28:55 EDT 2016 ### Oops, can't forget keys anymore! Ha ha, well this is the power of my approach, for sure. 
I locked my Mac in my drawer, where I usually edit my daily work journal, which I keep as a private repository on Github.com, so I switched it over to my PC Windows desktop for my first thinking through my morning thoughts. But I have to be in a Kubuntu VM all day for my current project, and so I didn't want the keyboard capture screen switching nightmare of VirtualBox full-screen mode -- no 3-finger swoosh of VMWare with a trackpad here... just checked on the Lenovo laptop native keyboard... confirmed, nope. And so, once in a fullscreen vbox vm, always in a fullscreen vbox vm, because the death-move keystroke combo actually required to switch screens isn't worth it. And so, I construct my same tmux "workspace" for journal-work that I construct for code-work, which keeps one panel 80-column wide full-screen to the left, and two smaller pannels split down the middle to the right. Something vaguely Amiga-reminiscent about this to me... can't remember what. Maybe tmux itself, who knows. I got familiar with vim in 1991 when it came out on FredFish, so why might I not already be familiar with tmux? Anywhoo, get those reports generating... -------------------------------------------------------------------------------- ## Mon, May 23, 2016 10:05:22 AM ### Don't Get Stressed! Do Great Work. Chip-away! Well, I forgot my keys going into the office this morning. And I forgot keys going to the Catskills this weekend, which turned out fine because we only ended up staying at her grandparent's place. But the fact I did this twice so close in a row shows me that I'm under stress, and really have to take inventory of things before I proceed to quickly and too blindly down certain paths. Think twice, then think a third time, then do (REALLY DO) those things you need to do that you deem necessary to move your life forward. Make some chisel-strikes, my man! Bank some wins! Be who you might become. Interestingly, having forgotten my keys means I'm doing my daily journaling here on my big main screen of my Lenovo laptop, and not my little Macbook Air that I'm so excited about using now that I have Windows 10, Kubuntu (and of course El Capitan) running simultaneously on it. Just as well, I would have been too tempted to make a video. As it is, switching journaling over to this hardware has been a big enough distraction this morning. And in fact, I'm going to want to live Kubuntu-side today doing that PostgreSQL work. Oh, speaking of which, let me dump my subway commute thoughts here, before going all VM... I am totally primed for an awesome new week. Finish out this project fast. Ending in sight. The occasional evenings to bang away on code will be nice. I have to teach Paige, our new Intern, SEO soon. She's working with Dimitry right now, but I've already started showing her my systems. How technical do you want to become at this job, Paige? I'm about to pull off some technical Kung Fu here. I'll run you through the project, from a technical requirements sense. And since you're also in the business end of it, and when it comes to the business-tech intersection, nothing could be more charged than choice of database, so I'll point out the occasional clash of the IT-overlords issues that come up there -- Microsoft vs. Oracle vs. FOSS. Letting corporate-overlord-acquired FOSS projects (MySQL) just slowly stagnate and die is a viable option as a way to kill it, at least in spirit, is a viable offensive option. I mean, who talks about the LAMP stack since the Oracle acquisition of MySQL. 
I mean, the Web was basically born on MySQL, and now its as good as gone the way of BerkelyDB. So, onto Redis and onto Mongo and every other db Couched as Meaner and better, but really full of sour grapes and traditional rdbms SQL-envy. There are however other enterprise-capable free and open source databases out there. Not many, but PosgreSQL, which as you'll remember is behind Amazon Redshift, which is behind our corporate Core database, meant to be a special competitive advantage. The reality of the situation is somewhat more difficult. You can't always adapt the tool to the people. So, we're now hitting core for fun. Okay, here's the project and it's various nuances. I Didn't have to, but I chose to hit core for two of these data pulls. It forces us to start getting up-close and personal with the data we're recording and processing for superior ad-serving and data-sales purposes. It's SQL. It's based on a product called Amazon Redshift, which is in turn based on a fork of a fork of PostgreSQL, or more commonly, postgres. Genesis of project: databases... are sometimes really terrible to have as a dependency in a process. Sure, they're great to keep a lot of data on-hand, but everyone ends up wanting to see the data in a spreadsheet eventually anyway, even if it's a save-as CSV link from the more interactive visual reports. So, if the perfectly capable, human readable and interchangeable... dumped into journal. Save, commit, and jump vm-side. -------------------------------------------------------------------------------- ## Sun, May 22, 2016 1:23:16 PM ### Priorities I just have to get a few things done today here at home, preferably while keeping Adi entertained and stimulated. Don't do JUST your own stuff, though. Pump up the warm & cozy aspect here at the apartment. What I'm about is not total cleaning... it's a great, functional and enjoyable environment. Remember, 80/20 rule in all things -- even this. Where does the first 80% of benefit come from today from my first 20% of effort... oh! Do a Levinux pull here on the SP4. Sun, May 22, 2016 4:37:14 PM I shouldn't get too used to Cygwin on Windows 10, because if the stars align just-so, the Ubuntu Bash shell and repository will be coming to Windows soon, and I'll get my backslashes and /usr/local/bin and all that happy stuff. Speaking of /usr/local/bin, one of my challenges I'd like to knock out of the way sooner than later is making the pip3 install location persistent without a backup. I'm learning how to make Tiny Core Linux extensions, but that doesn't really fit the bill here. Extensions are great for "installed" software and for common resources that should ALWAYS be there, but not in an always-changing form. Really, we want a location that is treated just like how /home is treated. I've already maxed out the mappable locations (as far as I know) provided by Tiny Core Linux bootcodes: - /opt - /home - /tce Now, there's a nuance here. These three locations are special locations provide for by Tiny Core for common use case -- keeping your work in home/username folders, or installing "optional" software that often chooses the /opt folder for "external" things like data, and the /tce location, which is where extensions get installed... and then "looped back" into the filesystem. Hmmm, maybe that loopback stuff has my answer. I don't quite need an extension (the pip install location will frequently be written to). But it's not a location that ALWAYS gets written to like /home. 
It's something in-between -- an occasional install, then frequent referring to it. Ohhh, one of the things I can do is make the whole Python 3 install go much quicker by writing my own local extension file and copy it over with tftp. I wonder how smooth tftp'ing goes AFTER the original Recipe.sh server build? I would imagine pretty well, because it's from the host machine INTO the guest, which seems to work a lot better with tftp (made for netboot) than the other way around (transmitting files from guest to host). The tftp server "lives" guest-side, so commands get issued most easily guest-side, pulling files down from known mapping-locations from the host -- space that's always found relative to the Levinux folder. Okay, so I'll write my own extension, keep it host-side in the Levinux files, pull it down and put it in-location during the Python3.sh build, which basically is an "empty" extension, but with a bunch of dependencies. These dependencies will be everything needed to run a non-X version of Python and all the components needed for non-graphical Levinux stuff. Same goes for vim -- I'm going to have to make my own vim extension. Hmmm. I wonder if's going to be as straightforward as I imagine: make a directory structure in temp, put a .dep file in location, and force a whole bunch of hand-selected stuff to install so as to avoid the exploding-X dependency chains that seems to double the final size of Levinux. The big culprits are vim and python3 -- one of the reasons it took me so long to do a first crack at a Python3 Levinux. Sun, May 22, 2016 8:54:43 PM Weekend winding down. Getting ready to drive Adi back to Staten Island. Need to look for helmet and scooter and stuff. I need to really keep aware that Adi exists on these 5 days of the week that I don't have her. I've been taking a super-deep sigh if relief that I get my first break in ten years. Adi's only been five years. Now, I get to be wholly and truly myself for the first time in a very, very long time, and I enjoy stretching my me-legs. The first order of business is getting my earning capacity secured. My second order of business is to feel invigorated and engaged in life -- preferably insofar as it ALSO relates to securing the first goal about stable income. I've got to be hot shit, and I have to be able to remain hot shit, and much of that comes from rapid understanding and embracing and ready competence with new technology. Think about Adi... always think about Adi. -------------------------------------------------------------------------------- ## Sun, May 22, 2016 12:22:16 PM ### The More Things Change - Virtual Screens Are The Measure of OS Elegance Okay, I'm on my SP4. Try to apply a bit of what you've learned on the Mac, lately. "True" full-screen? Nah, not worth it. THIS is true full-screen on Windows, because the default 3-finger swoosh does something much like an alt-tab, and recycles "windows" on the current screen. So many systems... and systems DO matter. There's is a battle of the systems going on right now... always has been going on, but is really heating up as the free and open source software community (FOSS) pulls ahead of the proprietary crap. Me re-positioning database-wise on PostgreSQL (after all these years since I've known about it) is symbolic. SQL Server turned me off towards SQL -- MOSTLY over the licensing nonsense, and then MySQL got bought by the Deathstar, as did Java -- and a whole world of database stuff (the whole M in the LAMP stack) became incredibly unappealing. 
And echos and whispers of PostgreSQL continued reaching my ears, as something different, something worthy, something BETTER than MySQL to fill the hole of enterprise-ready SQL database in the free and open source software space. But I was 10-years predisposed against it, because I thought it was only a port of Progress of the BTrieve days, and that Scala, Inc. had chosen it for their database -- a sure sign of its inferiority... haha! But no really, they've moved onto MongoDB, so it's okay finally for me to get bitten by the postgres bug... haha! Okay... let me think. Okay, I got the font-size adjusted so that this is coming across as full-screen. Oh, I'm going to have to hide the taskbar. Okay, set auto-hide and made the icons smaller. Hmmm, almost there. Can't get rid of the title-bar. I'm noticing things now about the Windows 10 implementation of stuff like virtual screens versus OS X's... in particular, the latest El Capitan stuff that seems to be working out the last-mile of elegant virtual screens. Apple's really done it right -- the integration between trackpad and OS really helps. I just don't think Microsoft or the GNOME or KDE people can rely on their hardware as much as Apple can, and the real elegance comes from that over-the-top assurance of control... the confidence that comes from controlling everything. BUT as in the case with the Amiga computer, the advantages of extremely tight coupling of components and extreme control only lasts a little while, as Moore's Law pushes hardware forward and forward and forward, until the state of affairs as they exist today look as technologically quaint as the Amiga days appear to us today. -------------------------------------------------------------------------------- ## Fri May 20 19:46:06 EDT 2016 ### Extruding a Book Well, well. I didn't think I'd be driving up to the Catskills tonight, but I am. And this time, I'll be sleeping at the incredible cabin of the in-laws... hmm, what will they be to me later? My child's grandparents, at very least. It's interesting to me to imagine that her grandparents on my side are actually Lynne and Chick. Been awhile since I've seen it spelled out like that. They were Chick and Lynne or Lynne and Chick to people who knew them. Heritage. Lineage... we are a composite of our parent's genetic makeup, and so on back through the generations. It's really an amazing system. How much of a just plain old information system are we really? Are we how true random expresses itself? I would think that truly self-aware matter, taking control of its own affairs to great, long-lasting effect would be one of the more substantial measures of success for an information processing unit such as our universe. I wonder if we've calculated any answers, or if it's all really like watching the blooming of an infinitely varied flower over and over. Okay, am I ready to start attempting to actually weave some mimetic hacks? I think it's nigh time. That's why I'm on the computer right now, after all. I have to start taking better advantage of the top and bottom of this file. I need a cut-off point, above-which will be chopped-off and extracted for use as the localhost:8080 page on Levinux, and a point below, which this is actually starting to become my book extruded off the bottom. The totality of the thing will continue to exist here as a one-long-page experiment, at mikelevinseo.com. Eventually maybe I'll start dumping some of my older journals in this space too, as time and technology allows. 
I have some going back to 1988 in paper notebook form, but one is actually in this very same git repository where I'm keeping this, just in the history of this very index.html file. I could always just clone it, go back to a commit-point, and copy-and-paste into a parallel copy at the repo head. And there's also my Webmaster Journal from my Scala days... hahaha! Wow, what a kick that would be. Anyhoo, time to copy-and-paste from Simplenote from my commute, then take care of the animals, and then get on my way. nformation-Agers of the Eighties The world is a lot smaller than it used to be. It was small when I was young. Only 4 billion. Mine is the generation that remembers pre-Internet. Pre information-explosion. Our age got shoved head-first into this brave new world that out-paced SciFi dreams, in only like 20 years. Some aren't even on Facebook yet, and never will be. Not me uhhhh uh. I live me my net. I couldn't wait to be on it. I wish I remembered with fondness George Robbins who lived with the CBMVAX. I just missed the age and generation of the VAX and PDP-legacy minicomputers. Not quite mainframes, but refrigerator-sized and the epicenter of the information age explosion. In short, I'm a no one -- an innocent bystander as my brothers and sisters paved the way of the information highway. I stood by, doing some interesting little projects as a non-scientist, non-engineer, non-professional programmer. I took up programming wholly for non-pro reasons. A person in a non-coding job who knows how to code... no expectations, so everything is gravy. And through those years, I have happened upon an information age bully are two. Oh those old-school Unix sysadmin god-complex assholes. Damn, did they not want you learning a little, because knowing a little could let you wreck a lot. The curious and the upstarts are best to crush-in-spirit sooner rather than later l, least they blossom into a flower in the same pot vying for sunlight. They put you down and choke you off. The power they could hold over you as admin is immense. It's the difference between getting back that days worth of work you just list, or not. The printer working correctly for you, or always giving you problems. And running code? Oh, you'll be lucky to run Word. And forget any server resources. And today, iPhones. And Raspberry Pi $35 computers. Some models are coming down to the $5 point. And you can be admin on any of them. Superuser status for all! Huh? Even the meaning and value of elevated permissions and access to computing resources is beginning unfathomable. That's just all the stuff in stuff--like phones and laptops. And laptops are mostly for surfing the net and watching video. Who could have ever dreamed that we'd me swimming in such overpowered massively interactive nearly magical battery powered wireless multimedia production studio powerhouses in our pockets, that it would all feel so commonplace. Our children are raised with this as just everyday things, as my generation was TV, and radios before that, and newspapers before that, on back to drums and smoke signals. It's all just self propagating and self-propelling information, let loose in an information substrate that is us. We humans are message receivers and interpreters and rebroadcasters. This age is ours to hack with an appreciation for what is there to be worked with. But what IS there to be worked with? How can you put together a little information nugget in such a way, as it comes to life? 
Thus-far, I have resisted all temptation to mess with you intrepid downloaders of Levinux. And another older phone entry, but also from today... Rally & Materialize Your Nix-Learners Tribe Step 1: Write a Memetically Evangelizing Intro "All things physical are information-theoric in origin, and this is a participatory universe." -- John Archibald Wheeler, Physicist Everything worth doing is a little bit difficult, or everyone would be doing it, and there'd be nothing special about it. Mastering that the old-school type-in Unix/Linux terminal that underlies everything, so that you use it as comfortably as speaking (like hackers in movies) is a valuable but disappearing skill. I propose to get you over the hard initial hump of mastering those skills, which are actually MORE relevant today than they ever were, along with learning an awesome programming language, plus the rich and often riviting stories behind it all. I believe in spending your time on love-worthy endeavors. For me, this is one of them. Your patience, open mindedness, and level of effort you're willing to put in are precisely what will start to incubate your internal high-tech super powers (think Ironman). But you will only get out of it what you put into it. But I have simplified things by only asking you to take up 4 tools: Linux, Python, vim and git, but Mastering any one of those could be a lifetime endeavor. And of those, vim is going to be particularly challenging to get started with, but rewarding (make all the difference) in the long-run. Once you've been at something for awhile, it ll gradually starts to make sense, so please don't give up right away! The word "technology" covers a lot of topics, so becoming more technical could mean a lot of things to different people. Here, it means Unix -- or nix-like systems. Few things are born perfect, and Unix least of all, but the more quickly and naturally we can implement, and then revisit past work to give it another go (iterate), the more we can zero-in on perfection, and gain a deep sense of satisfaction as others recognize and interact with our work. We will focus on gaining 80% of the benefit of our work from the first 20% of our effort. When standing at the edge of a rabbit hole, consider carefully before you chase it down, or you might end up in a world of nonsense, from which you never return. Stick with Linux, Python, vim and git as your always-on-you fallback tools, and you'll always have a way back from Wonderland. Here, we interact with with a specific version of Linux called Tiny Core. While it shares a lot with other versions of nix-systems, it is built very different. You are in one of the stranger outlying fringes of Linux distress. In particular, it was built to be tiny, incorruptible and highly compatible — these are all traits that I took advantage of in designing Levinux. Tiny Core Linux is the beating heart of the thing, but to get it running in the first place is a tiny free and open source (FOSS) PC emulator, which has been ported to every major desktop platform, and has some very cool networking context and ability to do a netboot, sucking its information from host to guest over tftp at time of boot up. This may sound like blah, blah, blah, but trust me, it's a pretty neat trick. Welcome to Levinux. Welcome to the free and open source software community. 
Welcome to the tradition of tech that went from clockwork-computers (Babbage) to vacuum tubes (Eniac) to transistors (PDP-1) to integrated circuits to MIT snobbery (ITS) to corporate dreaming (Multics) to practicality-driven skunkworks (Unix) to a Finnish freeing of those ideas (Linux) to a radical distillation of common parts into the smallest package both readily possible and practical. That's Levinux. I just stood on the shoulders of quite a few giants to connected a few dots so you can play with this stuff like a small toy. Congratulations. You're a few keystrokes from logging into an old-school type-in text terminal (much of the point here is about) to run the best computer language ever invented for making newbs like you feel welcome, while still setting you up for advanced lifetime technical Kung Fu power that applies to almost everything in life. Python truly rocks, and is displacing Java and Scheme as the introduction-to-programming language in school after school, right down to the legendary not-invented-here CompSci department of MIT. There was some crying when perfectly practical Python kicked out the home-grown wonderlanguage, LISP. So, I added Python as an optional add-on to Levinux, because nix alone isn't enough to tackle any problem that comes your way. Nix AND Python are. -------------------------------------------------------------------------------- ## Fri May 20 13:25:13 EDT 2016 ### All The Sysadmin Permissiony Things About PostgreSQL Worked Out Just get this first FDW success under your belt. I have clues, now. As my normal daily user on my Kubuntu machine, I can't get into /var/lib/postgresql/9.5/main... I have to sudo su myself to superuser first. And even then, I can't see anything with fdw in the name. There should be a file_fdw location or file_fdw_server location, I would have thought. But that path, which I found with the show data_directory; command doesn't look all that user-like. Okay, there's a bunch of stuff I'm just coming to realize -- not only about postgres, but also the sudo system. So, one of the first things to recognize is that (it appears) that users defined FOR postgres area also defined automatically for the OS. OS users and database accounts may be bound together from birth... or maybe not. But at any rate, the postgres user definitely exists on the OS, and the way to switch over to it is: sudo -u postgres -i This switches the actual command-prompt over to postgres, but you can see from the prompt, the postgres user itself is not a superuser on the system (and probably shouldn't be... little bobby tables). postgres@machinename:~$ Okay, but now we're getting somewhere. I AM THAT USER, and I know what files I can access now. So NOW go file the place to plop the CSVs that I can access with this permission context... cd ~/; pwd... I'm in /var/lib/postgresql and HERE I see a 9.5 directory, in which I see main and... ohhhhh, that path and directory I couldn't get in as miklevin. Duhhhh. Okay... almost there. Should the CSV files go somewhere INSIDE the 9.5 structure or outside? Well, outside, of course. I want to make the file location impervious to upgrade issues. But why shouldn't I be able to just give it file-read permission to /home/myusername/Dropbox/sharedfolder ? Okay, let me set up a new user and a new group and make that user part of that group. 
I used to always do sudo su, but the more modern way is apparently: sudo -s groupadd reporters useradd friendname usermod -aG reporters friendname usermod -aG reporters myusername passwd friendname Okay, now we're both in the reporters group. I need to give that group read-access now to a directory under /home/myusername/Dropbox/foo/ Fri May 20 14:24:57 EDT 2016 Okay, massive success! Success assured. I just gave postgres access to that dropbox location. Now, I have to turn the OS friendname user into a postgres user. Hmmmm... I think I won't mess around, and in the sake of time, let him use the postgres user login. BUT, it's still only accessible over localhost. Must change it to be LAN-accessible. http://www.faqforge.com/linux/server/postgresql/configure-postgresql-accept-connections/ Okay, this involves editing the postgresql.conf file, which is in: /etc/postgresql/9.5/main/postgresql.conf Any edits to this, and the db server is likely going to have to be restarted, which this file says can be done with a pg_ctl reload... from where? Okay, the way that appears that I can restart postgres reliably and without setting new paths to binaries is... sudo /etc/init.d/postrgresql restart Ah, Debian-style init.d/ folder. It may be baffling at first, but talk about one consistent API! Okay, I'll take it. Oh, and it has to be a member of the sudo superuser group, so my main login and not the postgres user. Okay, I just switched my VirtualBox networking from NAT to Bridged Adapter, and attempted pinging the VM from my the host using the bonjour.local style address, and success! BAM! Success assured again. -------------------------------------------------------------------------------- ## Fri May 20 11:27:53 EDT 2016 ### After All These Years, It's Still All About Drive Space Okay, the Windows 10 install worked (from Windows 7 on VMWare fusion on a 120GB hard drive, 4GB RAM, 1.7GHz i5, circa 2011 Macbook Air laptop. Wow! It's a squeeze, but it worked, and it's going to make for totally killer videos. But it's taking up 45GB of that total 120GB drive space, and that's a lot. I need to look at ways to shrink it, and free up more of my own drive space. One of my goals today will be to get Kubuntu back onto my Mac laptop, to set the stage for the most kick-ass talking-head Levinux/coding videos I've ever made. Project the appearance of: - Oh, he's a bit-tweaker... he knows nothing of rich desktop experiences. - Oh wait, he's on El Capitan, Windows 10 AND Kubuntu? Maybe he does know something. - Hey, how's he switching between all three so smoothly? - And in a screencast? - And while showing his every button press? - And while running even more subsequently nested virtual machines. - What is that, 6 total OS environments, with nary a hiccup? Ah ha! Freeing 4Gigs from Temporary Windows Installation Files, and over 5Gigs from Previous Windows Installation. And Settings / System is reporting 22.5 GB used, so chopping 10 off of that should be about a 12GB installation for Windows 10, plus another 10GB I'll give it for installs (Office 365? -- yikes). Now, research how to shrink the silly thing. The file is Windows 7 x64.vmwarevm, so it's a specific VMWare thing. That's what I chose on setup to ensure maximum "native feature" options. Here's the documentation. It looks like it's from the VMWare Workstation days... beware of outdated information! https://www.vmware.com/support/ws5/doc/ws_disk_shrink.html Okay, it's down to 14.6GB used on disk for Windows 10 under VMWare after upgrade. 
Okay, it involves the use of a program called sdelete from inside the guest: sdelete -z c: ...to "zero-out" your free space. Not exactly a defrag, but a good step to clearly advertise with zeros what space can be claimed back during a shrink process. Probably a full defrag would be best, but I've put enough into this. I just want 10 or 20 gigs back, so I can put Kubuntu back on. Well, shrinking virtual machines is a pretty straight forward process. My reclamation was nothing less than what I had hoped for. The whole Windows 10 VM is now down to under 15GB... not too shabby. I'm sure it'll grow big over time, but I'll lock in my Kubuntu-requires space right now! Done. Oh yeah, and run the Windows 10 Updater... ugh! Bigger grows the balloon again. Rinse and repeat the whole sdelete -z c: thing when it's done, just to get the final up-to-date pristine starting point. Yup... 18GB and growing, growing, growing. Wow, I wonder how its going to explode when the entire Ubuntu repository directory and bizarro parallel operating system is in there as well. Preparing to install updates... 23.2GB... just bought a 4TB WD NAS drive for home. It'll be my first NAS, and I desperately need it to continue to be a modern information savvy citizen. I need a place where I can dump hundreds upon hundreds of gigabytes, and not have to think twice. Definitely stationary and non-portable and at-home is the way to go. Get all the family album pics, such as they are, onto it, along with all your virtual machines, and phone and computer backups. -------------------------------------------------------------------------------- ## Fri May 20 10:20:53 EDT 2016 ### Friday Morning Musings On Missed Magical Moments Ugh, by Kubuntu VirtualBox didn't have Internet this morning, and I went about debugging my code to see why an API-call was coming back with zero results. Sheesh! I need better try/except output on the report.py script, particularly on the VERY FIRST API-call, which happens to be Search Console/Webmaster Tools, which happens to be flaky, having to go back in time to get the last-collected full day of data, so it threw me off. Thought it was an API issue when it was a local network access issue. Okay, and I am taking another crack at the Windows 10 upgrade under VMWare. I deleted all my recover-points for drive space, by right-clicking Computer and going to Properties and System Protection, then clicking Delete... oh, per this page: http://windows.microsoft.com/en-us/windows/delete-restore-point#1TC=windows-7 Who cares of links break that I leave in this journal. The journaling is the reward. Today's top priority is those friggin' data feeds, which I got so close to yesterday, but seemingly hit a permissions problem. I think it's the long user-oriented Dropbox filepath that I tried to read the foreign data out of, but kept getting permission errors. Okay, so today I find the most friendly local file locations for FDWs under PostgreSQL. By the way, where where you all my life, postgres? Oh yeah, under my nose! It's the database Scala used for their server components, and I wrote it off in my mind... probably because it was Scala... and probably unwisely, just as I had written off the advice of my buddy Guillaume Proux in Japan about using Python over Ruby, at the time I was first infatuated with Ruby on Rails -- circa 2004/5? Wow, it's over 10 years later. At least, I spent about 5 of them with Python. I'm not the guru I could have been, but neither am I on the wrong track anymore. 
Okay, the upgrade is up to Creating Windows 10 Media... all the stuff leading up to that has gone deceptively quickly. It's only in the final leg of the install when it's "checking a few things" that it gets irrecoverably hung. The bigger Windows and it's ilk get, the more appealing the "barely there" software stacks like Tiny Core Linux, and the stuff from the embedded world becomes. Levinux is really walking into an itch that A LOT of people need to scratch, but just don't know it yet. Don't screw this one up, like you did your generalized system -- basing it on VBScript and not thinking in FOSS software community terms right from the start. This time, I know better and can make the documentation OF what I'm trying to do PART OF what I'm trying to do... very meta. -------------------------------------------------------------------------------- ## Thu May 19 18:48:50 EDT 2016 ### Trying To Wrap Some Foreign Data - Pesky Permissions! Okay, I did the first two commands: CREATE EXTENSION file_fdw; CREATE SERVER file_fdw_server FOREIGN DATA WRAPPER file_fdw; ...and, of course so far so good. It's coming up on 7:00 PM, and I want to get at least one success under my belt before I leave today. By the way, every Windows 10 update attempt fails, getting stuck at checking for updates. I won't do another attempt until Windows 7 has totally cleared out all update attempts. They're hard to trigger off. My best bet yet has been this, then waiting: C:\Windows\System32> %windir%\system32\wuauclt.exe /detectnow Okay, but the real question now is can I execute SQL in psql from a text file? And how do I comment out chunks of text? Remember, \? for help. And I see this: \i FILE execute commands from file Well, that's a very strong indication of success here. I guess I want to write my SQL out long-form on this go-around, before I try figuring out programmatic looping and such. No problem; we enjoy typing! I'll say it again: we enjoy typing! Next? PostgreSQL dialect comments? Nope, standard SQL /* */ for multi-lines, and -- for single-lines. Done... get your first example bound table in place... oh wait, how to ensure we have the paths right. Oh, this should just be plain old Unix-style paths. Oh, so I'm going to use my original dropbox location! -------------------------------------------------------------------------------- ## Thu May 19 22:34:16 EDT 2016 ### Levinux now on Python 3 Okay, I want to think through next step on Levinux. It's becoming pretty clear, I just have to rework the initial the second-stage stuff that gets triggered by Pipulate.sh... first I have to change that filename to Python3.sh. Well, that about does it. Levinux upgraded to Python 3. Fri May 20 00:52:33 EDT 2016 -------------------------------------------------------------------------------- ## Thu May 19 22:04:49 EDT 2016 ### Didn't Master PostgreSQL In a Day... Alas! At home... late. Didn't catapult myself to full-fledged PostgreSQL admin in a day as I had hoped, but I'm close. My last leg of foreign data wrapper tables accessible as native SQL tables seems to be a permissions issue. Reading the O'Reilly book on the matter really drilled in the point of permission trickiness. Stayed late at work (also trying to upgrade a Windows 7 VM to a Windows 10 one). Diminishing returns on both fronts, and came home. But more convinced than ever, it's simple permission issues, like path access that the superuser account on the database-side may (by design) be limited from accessing with superuser privileges on the OS-side. 
Yup, sounds very feasible if all else looks good. What I want to do is make a user other than postgres, and give IT access to a file-location that's traditionally meant to be accessed by PostgreSQL accounts on the OS-side. I would presume this may be "table space" as defined in the database. These may be real absolute path locations (or relative to the postgres binary). Some stuff still to figure out, but this figuring out process is a sheer pleasure. I've been hungry for this since my SQL Server days, where I started down a path, then never completed it. I became database-shy, with all the nearly-a-database tech that stood in, from Python shelves to GSheets. -------------------------------------------------------------------------------- ## Thu May 19 13:24:16 EDT 2016 ### Actual Code To Create Foreign Data Wrappers in PostgreSQL Ugh, 67 more updates are recommended to install... and I haven't even gotten to the Windows 10 update yet. Yeesh! Maybe I should have found a way to do a fresh Windows 10 install without Windows 7 having previously been lurking underneath. After I'm done, I'm definitely going to have to purge the big ol' undo file-cruft that lets you undo a Windows 10 install. Then, I'll have to look into ways to shrink that ridiculously large virtual hard drive. Okay, have to just forge ahead to get stuff working today. Let's follow some net advice, concerning getting pgAdmin running, which I think will be very useful. First, assert control over the password for the default user postgres. I'll have to learn what putting localhost in that psql statement below does. I certainly want my desktop machine accessible from the network at large, and not localhost. Some discovery still to do. psql ALTER USER postgres PASSWORD 'blahsomething'; ALTER ROLE \q psql -U postgres -h localhost Password for user postres: (entered) psql (9.5.2) SSL connection (protocol: TLSv1.2, cipher: blahblah, bits: 256, compression: off) Type "help" for help. But at least now, I can: sudo apt-get install pgadmin3 Okay, I'm in pgadmin3 (loaded it as a desktop app / NOT through the browser), and I connected to the database using "localhost" as the Host field. That'll have to change. However, the port is 5432, and the Maintenance DB and Username are kept as the defaults, postgres. However, I use the password that I just set, and I'm in and can poke around the explorer -- very reminiscent of the old Microsoft tools I used to use in my SQL Server 6.5 days, but I have to say much better. I've been away for a long time, but I think my timing coming back is perfect. Okay, I see the "Scheme" called Reports that I made under psql. Thu May 19 16:46:08 EDT 2016 Just met with Adi and Rachel for lunch. Had ramen noodles finally at that place I pass almost every day, walking to and from lunch. Just had to delete XCode from my laptop to let the latest (maybe 3rd or 4th) attempt at installing Windows 10 to get past the latest download attempt. There must be a download cache that I can delete. This is ridiculous. I hope the drive size getting bigger isn't still just a one-way operation, if the amount of stuff being saved goes back down to sane levels. Does the expanding vmware virtual hard drive balloon also contract when data shrinks? I hope so. But back to the postgres thinking. Do this thing! Bind those files from today to foreign tables under postgres. I now have psql running in a full-screen window on my Kubuntu VM, and pgAdmin III running full-screen on the VM's second monitor output. 
It's remarkable how smoothly worked out this all is under a virtualization product. Point for VirtualBox! Okay, I won't JUST be a VMWare Fusion fanboy... even if it's an Oracle product I'm acknowledging... yuck! Okay, hmmm... 1, 2, 3... 1? One must be finding a sample table binding... wait, I already have that at this URL: http://www.postgresql.org/docs/9.5/static/file-fdw.html And this is the sample: CREATE FOREIGN TABLE pglog ( log_time timestamp(3) with time zone, user_name text, database_name text, process_id integer, connection_from text, session_id text, session_line_num bigint, command_tag text, session_start_time timestamp with time zone, virtual_transaction_id text, transaction_id bigint, error_severity text, sql_state_code text, message text, detail text, hint text, internal_query text, internal_query_pos integer, context text, query text, query_pos integer, location text, application_name text ) SERVER pglog OPTIONS ( filename '/home/josh/9.1/data/pg_log/pglog.csv', format 'csv' ); Or better yet, a sample from the book I just bought: http://www.postgresonline.com/journal/archives/250-File-FDW-Family-Part-1-file_fdw.html CREATE FOREIGN TABLE staging.aircraft ( Model Char (12), Last_Change_Date VarChar(10), Region VarChar(2), Make VarChar(6), Aircraft_Group VarChar(6), Regis_Code VarChar(7), Design_Character VarChar(3), No_Engines VarChar(11), Type_Engine VarChar(2), Type_Landing_Gear VarChar(2), TC_Data_Sheet_Number VarChar(8), TC_Model VarChar(20) ) SERVER file_fdw_server OPTIONS (format 'csv',header 'true' , filename '/fdw_data/aircraft.txt', delimiter E'\t', null ''); Mine will actually be a lot simpler, but I will have a lot more of them. There may be more elegant ways of not duplicating config code using the way tables have hierarchal parent/child relationships and all, but then I can also just barrel through this the way I know, and get it done. It's been enough friggin' discovery. Now, code! Put that same code example into... uh, a text file, of course! And make that a new private github repo... oh wait, I can use the existing "core" directory where the raw SQL queries are that actually CREATE the data in the first place (by virtue of their being sent to our big database in the sky). And so, I need a naming convention. Each one will be the text that I will be executing to create the respective data binding... oh wow, I could still be using my Dropbox location and simultaneously pushing those same data sources out to Marat... potentially giving Marat the ability to directly EDIT the post-processed data to take care of any data problems that may pop up to ruin a report's formating... how powerful! Okay, do an ls of the temp directory and redirect it to a sites.txt file in /core. Good, now do a couple of search & replaces in vim to edit out hyphens and .csv extensions. Okay, done. Now... now group the URLs and the Keyword file times, because they will be treated with different templates. Now, design the create foreign table code for each: CREATE EXTENSION file_fdw; CREATE SERVER file_fdw_server FOREIGN DATA WRAPPER file_fdw; CREATE FOREIGN TABLE schemaName.tableName ( keyword VarChar(50), otherField VarChar(25), andSoOn VarChar(25) ) SERVER file_fdw_server OPTIONS (format 'csv',header 'true' , filename '/fdw_data/tableName.txt', delimiter E',', null ''); And the rest of THAT work doesn't need to take place here in the journal, but rather, over in the private repo I already made for the SQL for this project... how unified! Okay... think! 
Cut this, and use mikelevinseo.com as the copy-and-paste channel for the above code template. Thu May 19 17:56:46 EDT 2016 -------------------------------------------------------------------------------- ## Thu May 19 11:27:00 EDT 2016 ### Beginning to Re-think the localhost:8080 Levinux Intro Text Okay, let those Microsloth Cruftdates finish in the background, while I forge ahead on the report stuff. They're done running, but now I need to manually check one of the property's files, and hand-fix the problem with multiple vertical bars (the pipe |) being interpreted as csv file delimiters by Tableau, when they're not. I could just replace every vertical bar in the file with an upper-case I. That would do it... hahaha, and so it shall be done... today only. Okay, done. But here's my subway writing, which may shape up to be the new localhost:8080 text on a Levinux initial build: # Levinux: your tiny computing /home ## INTRODUCTION Hello, and welcome to this very strange place, where you are admin. You are root. You are god here. And there's no real damage you can do, even with all this power. So the main barrier that keeps people from entering into the world of nix-like systems (Unix, Linux, etc.) has magically melted away. Follow along with me now, and shed the last bit of fear that you harbor internally that limits you. Everything in tech is really just moving text around. It may be given a pretty touchscreen user interface with bouncy gravity transitions, but underneath, it's just all text. Even when it's all hand-gestures, holograms and voice recognition, it'll still just be all text underneath there, with the only people who really understand (and control) what's going on, being the ones who can interact with an old-school text-terminal. Sure, dem's figtin' words to the excessively abstraction-inclined, and ideal abstractions do indeed have their very important indispensable place in things, but where would we be for instance, without Apple pushing forward the expectations of the Janeorjoe Average who just wants a cool phone that plays your music, keeps you in touch, and just always works? Steve Jobs has re-calibrated public expectations, and pried off all the arrow-keys from our keyboards... ...except, he couldn't. The heritage of computing on whose shoulders Apple stands so mightily has more in common with the text-inclined free and open source software community (FOSS) than it does the proprietary locked-down beautiful prison money-printing digital ecosystem about which it is so often accused of (exclusively) being. Go back to your desktop and open a terminal. If you're on Windows, you can use PuTTY, Cygwin's MinTTY or the soon-to-arrive Windows Ubuntu BASH shell. If you're on a Mac, just open a Terminal through Spotlight. And if you're on Ubuntu or some other Linux desktop, just open the standard Terminal. , no matter how hard you try. Every time you screw something up, you can just reset Levinux! At worst, you just re-download and get back to this point in a couple of minutes. What we're dealing with in tech these days is Unix. 
Make no mistake, there may be Cocoa syrup pored on top (Mac) or a monolithic Flemmish kernel underneath (Linux) -- and coming soon, a Canonically compatible core (Windows) -- all those computer system resources that are assumed to be there and are built-upon by free and open source software (FOSS) developers are directly lifted from the thing Ken Thompson and Dennis Ritchie created in the halls of Bell Labs in New Jersey circa 1970 (the state and time I was born) as a low priority office automation project called Unix. I feel that at that moment, great potential was created on two fronts -- potential that has suffered through great adversity, and fundamentally shifted and altered it's interior substance, while staying true to its original hopes and promise. But today, they are both finally winning through a combination of durability and worthiness that's a far cry from perfection, but it's good enough. Bell Labs themselves tried to supplant their own creation with a new Plan 9 OS, which got right everything that Unix got wrong the first time, and what are you now using today? And I've had my credibility attacked and assertions base about the possible effectiveness of my work, sometimes even with half-assed missing the essential point alternatives put forth. Whose version of "Unix" are you learning nix-like systems right now? And so here I am, an old dog (bit not nearly as old as Ken Thompson) trying to teach young pups a few old tricks -- because the old tricks are the really good ones. Old school is cool, and this is the computer-historically rich intro to a rich and worthy, yet rapidly fading from the public's conscience -------------------------------------------------------------------------------- ## Thu May 19 10:47:47 EDT 2016 ### windows 10 upgrade checking for updates taking forever Yep, that's a thing. My Windows 7 virtual machine, which I'm allowing to inflate to potentially a 60GB virtual disk image (my whole Macbook air hard drive haha) is already at 30GB, and I have to keep deleting and deleting stuff from the host machine to make room. Just had to delete my 8GB Kubuntu install (not my main now-daily Kubuntu work machine, which is on my work-provided Lenovo laptop), so I copied it to my 16GB keychain USB drive. BAM! Everything is flowing so naturally. It's ridiculous how much resources Windows gobbles up these days, system upgrade overhead temp-location requirements acknowledged, but my Levinux system, with full inflated (unzipped & running) software footprint is still under 70MB, with OS, Python, extensions, repos, local data stores, and even the QEMU PC emulator software included. There's GOT to be more than just a little unnecessary overhead in a Windows system, that is actually bad and working against you for a number of reasons -- not the least of which, is security. Forget about the time required for kick-the-tire adoption, for which a 20MB download, 60MB inflated/built software footprint, and 512MB RAM memory consuming virtual machine kicks Windows 10's ass. 
But I need to SHOW my tiny virtual machine running on Windows 10, and doing so on my Macbook Air, whose screen-size is just right for showing typing-text in YouTube, and whose software is just so perfect for swooshing between full-screen foreign OS virtual machines like Windows 10 and Kubuntu running under VMWare Fusion, and all being seamlessly screencam recorded, with all keystrokes captured (not just control-key sequences as with Camtasia Studio on my Microsoft Surface Pro 4) for a kick-ass impressive YouTube upload. My next Levinux demo will be flipping between 3 VERY MODERN OSes: El Capitan, Windows 10 and Kubuntu. I should definitely give thought to the preferred way to run Levinux under Windows 10, once the Ubuntu Bash shell arrives. Oh wow. Everything I tried to solve the "windows 10 upgrade checking for updates taking forever" problem didn't work, so I'm restarting windows 7, and I'm getting the blue upgrade screen telling me it's installing 156 updates! Well, I'm hoping that should help on the next attempt... haha! Something in there has to address the fact that it's hard to do the upgrade in some cases that Microsoft so wants us all to do, for support-purposes. Otherwise, we'd all still be on Windows 95, the first truly usable version of Windows (security, aside). I hope I'm going to be able to shrink my Windows 10 hard drive size back down after the install to get it to a more reasonable 20GB or so, so that I can get Kubuntu back on. This is all about that utterly magical video to dazzle and charm the informationatti cubs of the rising dawn of a new information age, in which people forget their hacker heritage and lost power. Oh yeah! My subway writing. Well, I'll cut the journal here and paste it up top because I don't want to mix it up with my windows 10 upgrade checking for updates taking forever issue. -------------------------------------------------------------------------------- ## Thu May 19 10:08:12 EDT 2016 ### Wow, it is 2016 Okay, here we go! I am so freshly charged up now, it's not even funny. Wow, are the old tech juices flowing. I needed reading that introduction to PostgreSQL in the O'Reilly book. It's been a long time since I've been on SQL Server, and I've so wanted to "be a fan" of some SQL database thingie. Nothing other than SQLlite ever seemed worthy over the years, but I never really gave PostgreSQL a good look either, being predisposed by the name as merely being a FOSS clone of Progress, an older proprietary database known for being a SQL shim over a table-oriented (pre-relational) b-trieve based ISAM (indexed sequential access mode) older database... called Btrieve. The software formally known as Progress is now the OpenEdge database... hahaha, open? Well, PostgreSQL, which appeared to have spoofed your name has inherited that crown... oh, I guess how Deadpool is more famous than Slade. I guess the silly one that does more insider-joke tricks wins. So, I'm running the reports now. I have to both run the reports "the old way", generating this morning's CSV files, clean up one of the properties whose output just happens to be triggering Tableau's re-definition of what the CSV delimiter is (only for those rows!). THIS is an interesting thing to have to debug right as I'm on the edge of an automation that will obsolete that. And so, I will just hand-edit the fix, rather even than coding something to proactively attempt to not trip-off Tableau's row-by-row re-evaluating what should be the delimiter character rules on CSV files... stupid. 
No concern, just generate the files, hand-fix, and focus on the Postgres Foreign Data Wrapper part of the automation. FDW's are a pretty awesome feature of Postgres, which is what led me down to the path of enlightenment that And I also need to focus on whatever postgres dbadmin stuff you need to know to make the data connection (on my virtual machine desktop - hahaha) available to Marat in San Fransisco over a local private network... wow, it is 2016. -------------------------------------------------------------------------------- ## Wed May 18 20:30:45 EDT 2016 ### Time To Change the Yellow Levinux Screen-1 Text Again Okay, next! I got the basic ASCII Art logo and messaging correct, but the yellow Matrix quote is no longer relevant. I'm going to try to really invigorate people to read certain books and do certain exercises. Let me re-write the yellow text. I'll do that right here. What do I REALLY want to say? Welcome, you sad little wannabe poser. You couldn't hack your way into this tiny Linux server running on your desktop even if I clearly wrote out the instructions right here. And don't dream to surfing to http://localhost:8080 Wow, got the messaging refined and inserted, and cleaned up ampersand ascii art. Wow, I'm getting back into the groove with Levinux. This is an amazing social hack meme thingie if I can really pull this off. Getting back into the right headspace and flow with Levinux is a good start. -------------------------------------------------------------------------------- ## Wed May 18 19:40:17 EDT 2016 ### Revised Levinux Messaging to Promote Connection to Python How best to use my every moment, now? Oh, that O'Reilly book about postgres. Wow, the intro alone is one of the most hard core evangelizing pieces of rah-rah, not even our shit stinks propaganda I've ever read. I LOVE IT! I've been waiting for this, since I've been driven off of SQL Server through licensing and moving on in life, and avoiding databases now that I didn't have to use them, and Google Sheets did so much of what I wanted out of Web databases. Managing lists online is managing lists online. The rest is security schemes and other details. Oh yeah, some user interface stuff, blah, blah, oh, is that a machine with Linux, Python, vim and git installed? Oh, I'm in business. Yeah, I got a database. It's called a Python shelve, thankyouverymuch! And now, the other side of the argument, I am finally open to, is being made to me now, finally and at long last. It is a perfect companion project to the tiny computing stuff. Oh, oh, a new tagline for Levinux: # Levinux: Your Tiny Computing /home It says it all! Wow, I think I've outdone myself with that one. Hmmmm, what do I do next? The ol' 1, 2, 3... 1? One would be finishing out some Levinux work. Bank a little bit of something, just to get you back into the swing of things. Check your last Levinux commit on github: https://github.com/miklevin/levinux/commit/601a2609d24e543828d39ed2cf55c4a05b2aef41 Okay, just git clone that down somewhere fresh. Revise the logo. Done. Wed May 18 20:29:34 EDT 2016 -------------------------------------------------------------------------------- ## Wed May 18 17:15:38 EDT 2016 ### Foreign Data Wrappers in PostgreSQL... Bingo! Okay, I spent a ton of time today talking and thinking and researching. In the end, I WANTED to use a Python Blaze Odo solution. 
I still might sometime in the future, but I need to install PostgreSQL (actually, already did) anyway, and in the course of doing so discovered a postgres feature called Foreign Data Wrappers. It's the lightest touch way of doing what I need to do: http://www.postgresql.org/docs/9.4/static/file-fdw.html Damn! This may end up worlds easier than I ever thought. I can keep these as separate data sources for now, so that Marat doesn't need to modify raw sql or anything. Just connect-to and pull in the full table contents. It could be the raw SQL: select * from table... and I just provide a bunch of tablenames that precisely correspond to the CSVs we've been using. Be meticulous about this. Follow the instructions precisely. Get your first success with a single CSV file. Ensure you can access it from elsewhere on the network. Replicate. Okay, I just bought the PostgreSQL Up & Running book from O'Reilly on Kindle. Don't really need it for this project, but it will serve me well, I think. I'll take it home and read it, and come to work much better prepared. I'm pretty confident about this foreign data wrapper thing, and think it's just firing off a few commands in the PostgreSQL command-line interface, which I'm already using to read the help commands and such. But the examples that are easily googled up assume too much requisite knowledge of postgres, so it's time for me to read up. Also, waiting out the Windows 10 upgrade on my VMWare Fusion instance of Windows 7, and its taking too long for me to hang around. Going to leave my laptop open and powered in a locked drawer to wait out the Getting Updates / Checking for Updates step, which is seeming to take longer than any other step... sheesh! -------------------------------------------------------------------------------- ## Wed May 18 10:26:33 EDT 2016 ### Windows 10 About to Become First Class Citizen in FOSS Ecosystem I just watched the video here: https://msdn.microsoft.com/en-us/commandline/wsl/about I can't wait for this change. It's almost enough for me to go onto the preliminary Windows release system, just to be trying this sooner rather than later. I can see Microsoft loaded with Apple envy in this development, with Apple having basically pulled off a technology miracle when they re-platformed their very OS from all the OS9 Apple-built proprietary stuff, onto a Berkely-derived Darwin Unix in OS X, which brought Apple firmly into the camp of the compsci god-nerds who are shaping the free and open source (FOSS) future. In this interesting move, Microsoft appears to almost be one-upping Apple, by taking a radically different approach of TWO simultaneously running full systems. Microsoft insists its not a virtual machine, but clearly it is using all the facilities built into the hardware for vitalization to partition off a parallel set of resources, accessible the moment you type "bash" into a normal Windows command console. And it's a more familiar nix than Apple's, with it's odd /Users directory and lack of /usr/bin/local for software installs. It looks like Microsoft gets the entire Ubuntu software repo for free (though, I'm sure they're paying through the wazoo for it), so no homebrew. This is a tour de force showing by Microsoft, with extreme cleverness, attempting to leverage their greatest strengths (Visual Studio?) in not quite too little, too late, but rather maybe just enough, quite but not entirely too late. Yep, that's about it. 
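Circling back to the foreign data wrapper entry above for a second, just to make it concrete for myself: the whole file_fdw setup should amount to a couple of statements. This is only a sketch -- the connection details, table name, columns and CSV path are placeholders, not the real report schema:

import psycopg2

# Placeholder connection and schema; the real thing would mirror one of the
# morning-report CSVs column for column.
conn = psycopg2.connect(dbname='reports', user='postgres')
conn.autocommit = True
cur = conn.cursor()
cur.execute("CREATE EXTENSION IF NOT EXISTS file_fdw;")
cur.execute("CREATE SERVER csv_files FOREIGN DATA WRAPPER file_fdw;")
cur.execute("""
    CREATE FOREIGN TABLE report_keywords (
        keyword text,
        url     text,
        volume  integer
    ) SERVER csv_files
    OPTIONS (filename '/home/mike/reports/keywords.csv', format 'csv', header 'true');
""")
cur.execute("SELECT * FROM report_keywords LIMIT 5;")
for row in cur.fetchall():
    print(row)

Then Marat just does his select * from the table and never has to know it's really a CSV sitting on my virtual machine. Okay, back to the Windows bash thing: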
People who HAVE to stay on Windows for whatever reason won't be quite so alienated by it, when they can just open a bash shell and really truly do everything just like an Ubuntu user -- EVEN BETTER than on a Mac, where you're forced to choose between Homebrew, fink and MacPorts to get that same awesome FOSS repo facility -- none of which are actually official on Apple. But on Windows now, the entire Ubuntu software repo is about to be built-in and official. Shit, that's big. All the demos given in that video have a Ruby and WEBrick emphasis, showing me that they're targeting the most hopeless of nix users -- Ruby users -- who really otherwise had no hope of existing on the Windows platform, as they're all purists, and less tolerant of jumping through all the convolutions Windows used to put you through just to get Rails running. Consequently, the whole ROR scene has either become a Mac scene, or to a much lesser degree, a Linux desktop scene. Now, Ruby is about to become part of the Windows scene, in a very big way. There are issues. Services running in the background have issues, especially once you close all your bash sessions. There's still about a zillion more moving parts than an ACTUAL Linux system, so hold back on premature enthusiasm. Play with it the second it hits a mainstream Windows release, but don't make it your primary/only dev platform. Just replace Cygwin and Anaconda on your work machine, and CONSIDER not running your Kubuntu VM anymore on Windows. -------------------------------------------------------------------------------- ## Wed May 18 09:25:47 EDT 2016 ### Integrated Lifestyle & Loving What You Do It feels good to have done some journaling at home and then come into work to see it sitting right there after a clean git pull. I'm getting better to remember to do my last commit and push before I leave either work or before I go to bed at night at home. I heard a guy and girl talking as I came off the elevator after work at around 6:30 PM last night talking about someone they knew who loved their work and did it with a crazy intensity, and they proceeded to immediately zero-in on the very couple of realities that I have accepted for myself: ### Integration Everything is one big thing. There's no separation between parts of life, with some sort of isolation firewall. Vacation Mike is no different from doing-work-I-love Mike. In fact, I generally enjoy doing work that I love MORE than vacation. I view this as an integrated life that yields the sort of efficiencies you need in order to accomplish very big things, versus a disjointed life, for which the reset-button always seems to be re-pressed, setting you (me) back to something I'd characterize as "average" and unable to really make that ding in the Universe that I desire to make. ### Loving The Thing You're Doing Right Now The thing I'm doing right now is the one thing I'd like to be doing more than anything else in life. Yes, dealing with issues of making the best Python development platform for different use-cases, and then actually putting them to the test, DOING those projects they were designed to facilitate, is actually as and more interesting to me than just about anything else. It's the culmination of ages of work, whose heritage goes back to at least the mid 1960's. Inspired by these notions, I started to re-read The Cathedral & The Bazaar and see how much I have to read Steven Levy's book Hackers, Heroes of the Computer Revolution about the really early pioneering days of computers. 
Also, I am now fully re-platformed onto Kubuntu, which is the KDE distribution of Ubuntu. That means I continue to get the Debian packaging system (more often known as apt-get) and the benefits of Ubuntu's amazing hardware support and mature drivers. Plus, I can always switch back to Ubuntu Unity or GNOME pretty easily if I turn out to hate KDE. In particular, I'm on the release of KDE known as the Plasma 5 Desktop, which is acquiring a pretty good reputation for having already worked out really well the things that Ubuntu is only just getting around to, like a REAL, seamlessly integrated mobile/desktop experience. KDE now sports the "burger button" of the HTML/Responsive Design world in both the upper-left AND lower-right of its desktop user interface. The functions under the KDE burger buttons appear to primarily be: - Show Desktop - Add Widgets (a native Android-like feature -- interesting!) - Activities (???) - Lock Widgets - Desktop Settings - Lock Screen - Leave It's interesting how I've been hearing about KDE as this other Linux desktop camp that's out there, that's been a mystery to me for so long. It was all I could do to reposition myself onto two primary Linux platforms: - Tiny Core Linux for an extremely minimal and lightweight embedded-style Linux - Ubuntu for the most mainstream (supporting much software & hardware) desktop Now that I'm totally comfortable with that, and having kicked the tires of many other GNOME-oriented Linuxes like CentOS, Mint and Fedora, I have settled now on the decidedly NOT-POSIX (by nature of the use of dpkg over rpm for package management) Debian derivatives. And Ubuntu is the biggest of the Debian derivatives, dealing with many of the crazy hardware support issues that the completely non-commercial Debian doesn't have the resources to tackle. Apparently, Canonical (the company behind Ubuntu) does have these resources, and some very formidable projects have decided to base themselves on Ubuntu rather than Debian, such as ChromeOS and even Microsoft's new BASH shell for Windows 10. It's the actual Canonical binaries that Microsoft is using... wow! If Canonical being in bed with Google weren't enough to solidify its near-term future, then the deal with Microsoft certainly seals it. Ubuntu, in my mind, from these two deals alone, has now fully displaced Red Hat as the one true commercial version of Linux that you can rely on for the foreseeable future. Oops, I guess the POSIX standard isn't the gold standard of "good" Linuxes anymore. Ah, Windows BASH... watch the video on this URL: https://msdn.microsoft.com/en-us/commandline/wsl/about Modern developers working on open source projects today have to build things that run the gamut, and work on everything. -------------------------------------------------------------------------------- ## Tue May 17 20:13:03 EDT 2016 ### The Word is Noosphere Pshwew! At home. Thinking. I have time to think. Soon, I will have time to miss Adi on days beyond just the weekends. I will be dug out eventually. Hopefully, sooner rather than later. And at that time, I could be like, gee, I wish I had Adi occupying more of this space during the week. Who knows, maybe someday I will become independently wealthy. It could happen. It probably could have happened several times over for me by now, if I wasn't such a dumb-ass. But now that I'm 45, and have some little bit of wisdom under my belt, what do I think? I think it is a good time to be hungry and apply some battle-won mad skillz. And I am oh so ready to do that.
Take another crack at the Python & Levinux ASCII art logo. These are interesting times, in that Python 3 has just barely poked its way into the Tiny Core Linux 7 repository, whereas Python 2.x has pretty massive support in there, with libxml2-python, among other things, sitting there ready as Tiny Core Linux repository packages ready to be installed. I wonder what kind of problems I'm going to run into repositioning myself on Python 3. I see from building my KDE system today, that getting pip installed alone doesn't get you off free and clear. Certain software capabilities have to be added at the OS-service level, like optimized C xml-parsing libraries that Python can create wrappers for -- one of Python's favorite tricks. Anyway, here's some writing from my commute home: It's Virtual Environments All The Way Down Finally using virtualenv. It's funny how my Levinux identical-VM-on-any-host-OS project and infatuation with the connecting the Tiny Core Linux and QEMU dots has made me uniquely not in need of virtualenv. Little bubbles don't need other little bubbles inside them -- at least, not the unnecessary ones. But now that I'm in a situation where I could benefit, I will benefit. Virtualenv, huh? Okay, so that's all there is. I'm glad I didn't need that extra project overhead of mixed contexts inside jeOS-box server nodes that are themselves dedicated to your single purpose. So, why virtualenv? How many unnecessary bubble isolation layers do you really need between you and the hardware? Everything costs something, even if it's just mapping overhead. Redirects by any other name are still a pain. One level of hardware abstraction, or 3 max. Three's my limit. And so, my thoughts come to whether Python is truly a first class concurrent language, every bit as ready for the big-boy jobs as Go Lang or Rust or some other such elegance in edge case languages -- but still not Python class. Man, I'll take that Dutchman's assessment of a usable API in almost all cases and times. Sure, he gave us Urllib2, but come on, when's the last time you're really typed it, given Requests? So why not requests in the core native packages? Because Python is not Ruby on Rails. It is Python on Python. It's one bad ass framework, both begging you and rewarding you for going that last mile in your own special way, optimized towards YOUR priorities. Getting those abstraction layers just right is important, let me tell you. It's the difference between never being able to learn to really program, and to enjoying programming as one of your most naturally flowing creative and flexible creative outlets. We enjoy typing. We are WET. Python is a wet language. Ruby is don't repeat yourself -- that is to say, DRY. If you like one master framework at the opinionated and signature-y final implementation stage, you'll love the RAILS community sharpening the edge on a joyfully DRY, CRUDy framework. It wasn't for me. But I'm living Flask. Flask is built on Requests and Werkzeug and Jinja2 and Urllib3. And all their creators are the benevolent dictators of their own sub-sub-sub-cultures. These people go to PyCON, and all know each other, can tell you the difference between PyPI and PyPy (enthusiastically) and are all pursuing dreams that somehow keeps Python pinned up as an inextricable part. Why? So many reasons, but they all boil down to on average really good priorities and decision making in language design by a certain Dutchman and the designers of the ABC programming language. 
These folk favor the convenience and time of the language user. It's a language that rewards its speaker for using it by raising their general abilities and speed. These sub-sub-sub-dictators that spring up around Python and Guido are filling highly charged valence electron shells left, I presume, deliberately empty to build his molecule and help make it self-replicating -- or, at least, self-sustaining beyond the Great Pythonic days of the Dictator. And here I am with something new in the noosphere. It's time I carried it all out properly, using my best Kung Foo. foo, that is. Foo and bar, my two favorite variable names, second only to spam and eggs, which I can't get enough of. Indomitable spirit! Don't let others get you down! Just honk their nose like a clown! They win only when you frown. So turn that frown right on around. And that's all of that. I think I'll either create something crazy-awesome, or totally nuts. Free-floating neurons in a digital goo of primordial ocean. Wouldn't be surprised if an A.I. came skipping up out of the 3D printers one day, like Ultron. I hope Adi shares even just a quarter of my interests. -------------------------------------------------------------------------------- ## Tue May 17 13:48:09 EDT 2016 ### Turning my new Kubuntu 16.04 VM into New Dev Machine Doing the settling-in with multiple machines. Want to start documenting my machine build process. - apt-get install tmux - echo "set -g mouse" > ~/.tmux.conf - apt-get install vim-nox-py2 Ugh, I forgot to apply the unique.py post-processing to the report procedure this morning. Almost caused an embarrassment by the titles not being in place during a big meeting happening right now. Okay, focus like a laser beam on getting this automation done. Okay, this feeling... more fundamental solutions imminent. The best solution is often to bear down on the better approach and just get it done. What unanswered decisions do I have? - Raw SQL - SQLAlchemy's ORM - Blaze It's coming up on 4:00 PM. Haven't even taken lunch. Two high-pressure things happened today. Shit, I wouldn't be good as a lawyer or anything. I spend a lot of my energies making sure things don't become high pressure for me. Put in the energy now so that things rarely get this way again. You can't control for everything, but by having a little bit of time to allocate to things, you can control for a lot. Don't be risk-averse, but when you take risks, make it more of a game feeling than any potential source of angst. How precisely do you bear down now on this new project? Ugh, I should complete my switching over to a different virtual machine platform. Always running out of space should be reason enough to stay your course in the 1 of the 1, 2, 3 step solution... 1: switch over to Kubuntu. 1. Put all the files that would otherwise be difficult to move around with the repository systems (because they're more private than github private) onto Dropbox... done. 2. Power-down the Ubuntu VM and power-up the Kubuntu one. Finish the system-build you had started before everything started blowing up today. - sudo apt-get install tmux - echo "set -g mouse" > ~/.tmux.conf - sudo apt-get install vim-nox-py2 Okay, now that I'm switching to Python 3 in general, it's time to start using virtualenv. Check what Python comes pre-installed on Kubuntu... 2.7.11+, of course. Try running Python 3 by name... yep: Python 3.5.1+. Okay, so it comes with both versions pre-installed. Okay, this is where the philosophy stuff really comes into play.
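Quick practical note to self before the philosophy: a script that might get run with either python or python3 can check which interpreter it landed on before doing anything silly. Just a tiny sketch, nothing more:

import sys

# Kubuntu 16.04 ships "python" as 2.7.x and "python3" as 3.5.x, so be explicit.
print(sys.executable, sys.version.split()[0])
if sys.version_info[0] < 3:
    raise SystemExit("Run me with python3, please.")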
This will factor also into how I do the imminent Levinux 3.1. We now want to switch over to pip ASAP and do things primarily through pip. When on VMs like Kubuntu, I will also use virtualenvs. When on ultra-tiny JeOS boxes like Levinux, I won't use virtualenv. - sudo apt-get install git - git config --global user.email "email" - git config --global user.name "name" - ssh-keygen -C "email" - sudo apt-get install python-pip - pip install --upgrade pip - pip install virtualenv Interesting! virtualenv hacks the shell once activated with: source venv/bin/activate You get a command prompt that puts the name of your virtualenv in parentheses at the beginning of the prompt, in this case: (venv) miklevin@mykubuntuvm:~/projectname$ Interesting... so you activate with the word "source" and you deactivate it with the word "deactivate". So, the feeling of it is: source and deactivate. I'm going to skip virtualenvwrapper until I'm more comfortable with what virtualenv itself is doing. I don't want to mix the roles of these two things together. Okay, up to the Kubuntu 16.04 Dropbox install. Easy enough to get the Ubuntu 64 .deb file from dropbox.com. Then, the trick is to surf to it, right-click and Open With QApt Package Installer. Okay, I got the note during the Dropbox install: > In order to use Dropbox, you must download the proprietary daemon. > Note: python-gpgme is not installed, we will not be able to verify binary > signatures. Okay, Dropbox is installed on my Kubuntu. Now... next... 1, 2, 3... 1? Oh yeah, use those files that you installed Dropbox to move around! - pip install requests - pip install httplib2 - pip install google-api-python-client And now for the strange lxml stuff that needs to be installed both from the operating system side AND the Python repo side. - apt-get install python-dev libxml2 libxml2-dev libxslt-dev - pip install lxml So far, the stuff I've installed I would consider part of my "core" package requirements. Here's some stuff in there just to connect to core and reorder CSV files. - sudo apt-get install libpq-dev (required for psycopg2) - pip install psycopg2 - pip install numpy - pip install pandas Wow, the pandas install can take forever (over 10 minutes), and that's only after I decided to do the numpy install first to make sure nothing was going wrong with the binary compile step of numpy. Okay, what I HAVEN'T done today is pick the database technique and approach, and that's TONS more moving parts that I have to dedicate almost a full day to. I would be well-served to get the unique.py process into report.py so oversights like today just can't occur again, manual process or not. And maybe what I want to do is actually ingest the already-created CSV files. There are really many approaches here. I really like having CSV files as a side-effect of this sort of processing. It's like having a paper trail. Okay, an important thing to remember is that virtualenv is bound to the particular shell you're in, so if you haven't activated it in that shell, remember: source venv/bin/activate -------------------------------------------------------------------------------- ## Tue May 17 10:06:48 EDT 2016 ### Kubuntu 16.04 installed on VMWare Fusion and VirtualBox (plus VMWare Tools) Running the reports. Final leg of automation ahead of me -- today, I believe! Wow, what a perfect challenge to put in front of me right at this critical moment. I would LOVE to get Pandas out of the picture to reduce the heavy-duty dependency of this project.
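For the simple reorder-and-de-dupe passes, the standard library's csv module might honestly be enough. A rough sketch of roughly the kind of title-uniqueness pass unique.py handles today, with made-up column names and file paths (not the real report schema):

import csv

# Keep only the first row seen for each title; columns and paths are
# placeholders for illustration.
def unique_rows(in_path, out_path, key='title'):
    seen = set()
    with open(in_path, newline='', encoding='utf-8') as src, \
            open(out_path, 'w', newline='', encoding='utf-8') as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if row[key] not in seen:
                seen.add(row[key])
                writer.writerow(row)

unique_rows('report.csv', 'report_unique.csv')

No binary compile step, no ten-minute install -- but also none of the grouping and pivoting Pandas gives me for free, so this is a maybe.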
One thing I want to do straight away is to make my Macbook Air at the office into an awesome VMWare Fusion multi-OS testing environment. And to do that... hmmmm... I get one Windows and one Linux running. There's already an Ubuntu 14.04 under VMWare, and I'm doing a gratuitous apt-get update/upgrade just to see it through, but chances are I'm going to dump it for Kubuntu 16.04 in a moment, depending on hard drive space. I'm also about to put Windows 7 on it, and give the Windows 10 upgrade offer a good think. Okay, installing Kubuntu 16.04 now, which I'm hoping has the Plasma 5 Desktop, while the morning reports run. Oops, I do have a space issue on this machine. Deleted the Ubuntu 14.04 install and my Windows 7 ISO file-copy. Oh, haha, after the Kubuntu install, the way to get VMWare Tools installed is an apt-get! sudo apt-get install open-vm-tools-desktop A-ha! As I thought. When working with VMWare Fusion, never bother to try to get the screen resolution to match right after a fresh install. Just get VMWare Tools installed! The next reboot makes everything look magically correct. Okay, not only did the Kubuntu install finish and the VMWare Tools work to make it pretty, but I also did a test download-and-run of Levinux, which works under KDE just as well as I had hoped, and has the added benefit of no ambiguity on what happens with the double-click of a .sh file. You just alt-click and select Actions / Run in Konsole. Okay, that was a complete success. I have Kubuntu 16.04 now on my Macbook Air at the office, and can start to get my KDE experience, and Plasma 5 Desktop, in particular. But I'm also now putting it on VirtualBox on my PC. Okay wow, I am set up pretty well now both on my Macbook Air and on my Lenovo Thinkpad, as far as Kubuntu desktops are concerned, and I also installed Windows 7 on my Macbook in preparation for going back to the talking head videos that work so well with VMWare Fusion and Screencam. Hard drive space I believe will be a perpetual issue. -------------------------------------------------------------------------------- ## Tue May 17 09:49:57 EDT 2016 ### Making My Python 3 Move & Planning Pipulate 2 Wow, shot a great video this morning, laying out all my thoughts in video format. All those little dreamed-of projects gradually coming into focus... Levinux - Tearing down silly obstacles between you and your dreams. I also had the big realization that Anaconda, and its accompanying "conda" command, is really just pip for Windows... duhhhh! I wonder if I can use Anaconda under cygwin. I have to remember to dump my morning subway writing here too. It's as good as anything I tap out at the office: Python 3 & Levinux It's time for Python 3 and an approach to Pipulate that: - Streams progress to browser when web-based - Streams progress to console when shell-based - Alerts begin-and-end when desktop-based So, this is more than just a web service with some request/response API. This is something of an educational little show. Using Pipulate should educate you about Python, allowing you to visualize the movement and transformation of data. Therefore, the time spent in processing a large, line-oriented batch request should be highly "exposed" to the Pipulate user in a cool animated fashion, gradually teaching them Python over time. I feel a certain project kinship with IPython Notebook, intended greatly to teach -- as is Python itself. It's all about alignment of capabilities with endeavors with propensities.
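To sketch the streaming idea from that list while it's fresh: one generator does the work and yields progress line by line, so a console caller just prints it and a web layer could stream the very same lines to the browser. Function and variable names here are invented, not real Pipulate code:

# process_row is a stand-in for whatever pipulation is being performed.
def pipulate(rows, process_row):
    total = len(rows)
    for i, row in enumerate(rows, 1):
        yield "%d/%d %r -> %r" % (i, total, row, process_row(row))

for line in pipulate(['spam', 'eggs'], str.upper):
    print(line)

The desktop case would just hang begin-and-end alerts off the first and last yield.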
Python is chatty and informative, of the right information at the right time, delivered in the right fashion. Pipulate uses the logical, easily accessed input/output mechanisms. This is something that opens a chatty communication channel while you wait for its work to complete, THEN delivers the finished product, using the very same chatty response stream (Python generator yields) I finally figured out how to get pip to install under Python 3 on Levinux, and am now both perched and juiced for my next step. Doing it all under Levinux on Python 3 is a really outstanding way to frame my next round of work. -------------------------------------------------------------------------------- ## Tue May 17 07:21:15 EDT 2016 ### Python & Levinux I'm going to be shifting the focus of Levinux away from Pipulate (until it's re-implemented) over to Python. So, be ready to change the bootup screen to: echo -e "\e[1;34m \e[0;37m... " echo -e "\e[1;37m A tiny virtual server to help you learn. \e[0;37mx88\" !. " echo -e "\e[1;34m ____ _ _ \e[0;37m888X 8 .-=x. " echo -e "\e[1;34m | _ \ _ _| |_| |__ ___ _ __ \e[0;37mX8888. X\" :'.H88L " echo -e "\e[1;34m | |_) | | | | __| '_ \ / _ \| '_ \ \e[0;37m?88888X. f 4888\" " echo -e "\e[1;34m | __/| |_| | |_| | | | (_) | | | | \e[0;37m.x*888888hX \`\"\` " echo -e "\e[1;34m |_| \__, |\__|_| |_|\___/|_| |_| \e[0;37md8 \`?8888888. " echo -e "\e[1;31m | | \e[1;34m|___/\e[1;31m _(_)_ __ _ ___ __ \e[0;37mX88L \`%888888k " echo -e "\e[1;31m | | / _ \ \ / / | '_ \| | | \ \/ / \e[0;37m8888x ?88888> " echo -e "\e[1;31m | |__| __/\ V /| | | | | |_| |> < \e[0;37m888888hx.x! ?888> " echo -e "\e[1;31m |_____\___| \_/ |_|_| |_|\__,_/_/\_\ \e[0;37m'*8888888\" '888 " echo -e "\e[1;31m \e[0;37m\`\"\"\"\"\" .88% (\e[1;37mLevinux.com\e[00m)" -------------------------------------------------------------------------------- ## Mon May 16 15:47:24 EDT 2016 ### Getting pip3 to install finally under Levinux with Python 3 I need a fresh mindset for a moment. Solve simultaneous equations, but don't over-complicate anything. Pandas and numpy as a Levinux dependency? I don't think so! But maybe. How lightweight could that be? Nothing happens until Levinux is running Python3 with pip3. What was that last error? Hmmm, it was... File "/usr/local/lib/python3.4/distutils/sysconfig.py", line 437, in _init_posix raise DistutilsPlatformError(my_msg) distutils.errors.DistutilsPlatformError: invalid Python installation: unable to open /usr/local/lib/python3.4/config-3.4m/Makefile (No such file or directory) ...which makes me think it could be a coreutil build thing. So try first: tce-load -wi coreutils Mon May 16 17:31:57 EDT 2016 Nope! But I'm on the trail. Before I leave today, I will have a very good notion of what I'm doing. Remember this location: /usr/local/lib/python3.4/site-packages/ It put the file pip-8.1.2-py3.4.egg in there... VERY good sign! Success! Here's the process... tce-load -wi python3-dev wget https://bootstrap.pypa.io/ez_setup.py sudo python3 ez_setup.py sudo easy_install pip sudo pip install requests And now remember that strange way to get busybox ssl server to connect: ssh -oKexAlgorithms=+diffie-hellman-group1-sha1 tc@localhost -p 2222 Okay, let's rebuild a Levinux instance from "scratch" all the way to a pip3 install... all worked! I've got a clear path forward now. Also remember: /usr/local/lib/site-python/ ...and that you can do this: import site site.getsitepackages() Okay, now that you have this capability, you should forge bravely ahead. 
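And a quick sanity check I can run on the freshly bootstrapped Levinux Python 3, just to prove the whole chain (python3, pip, requests) is alive -- any reachable URL will do for the test:

import sys, site
import requests  # installed above with: sudo pip install requests

print(sys.version)
print(site.getsitepackages())
print(requests.get('http://example.com').status_code)

If that prints a 200, the new chapter is officially open.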
I wonder if you can install IPython under this. I wonder if I should always get coreutils on Levinux. Hey, I wonder if I can install IPython Notebook... sudo pip install jupyter Mon May 16 18:16:06 EDT 2016 Well, it's looking good, but I want to head home (not wait through the whole install). But I am very equipped for taking Pipulate and Levinux to the next level now. I think when I restart my tutorials, I'm going to do all the work directly on Levinux, so everybody can follow along and have exactly the same experience. I don't think it will involve Jupyter Notebook though... I'll still be keeping things on the very lightweight side, after all. -------------------------------------------------------------------------------- ## Mon May 16 12:58:36 EDT 2016 ### Duck! Okay, Rachel called me at the office because Adi had a tick from the Catskills which I should have caught. Now, I admit that readily, and I will from this point forward diligently go over Adi for ticks, but calling me in the middle of the day over it... does she really not know me by now, or just not care? I just matter-of-fact okay'd my way through it. Let shit like this fizzle, and get back into your zone, groove, flow or whatever. Accept the criticism, and ignore the fact that it was delivered as a little mid-work-day grenade. I did have the option to not call her back, which I probably should have taken. That message could just as easily have been delivered after work, so fuck it. And back to our regularly scheduled way to not drown after having disgustingly over-extended myself over the past 10 years... work! Your last bit of time now should be spent getting the data as easy to work with as possible during the work-session coming up at 2:00 PM. - Delete all my Filter Views. - Add a Filter View for Pri1Menu - Add a Filter View for Pri2Menu - Create a Highest Volume filter view Mon May 16 14:44:45 EDT 2016 Ugh, had the menu meeting, and it didn't go nearly as well as I would have liked. After that, I still felt unprepared. Ask yourself: Can you fully define the menus for a site algorithmically? Can you completely follow the data and suggest a drop-down menu? Something that could be seeded with preferences, and automatically infer hierarchy? Hmmm. I'm on the cusp of making some very important decisions for my career. I'm either going to end up a bozo in a lot of people's eyes, or just far enough ahead to be a force-of-nature, mind-bogglingly perfect for the times. And of course, I would like that latter scenario, but... but... I took the wrong approach at first, with this last menu project. I didn't... what? Didn't fully internalize the importance and challenge of the projects. Do so now! -------------------------------------------------------------------------------- ## Mon May 16 11:41:41 EDT 2016 ### Menus, Menus and More Menus Okay, let's just use the Priority column of the Menu tool to make: - Featured menu = Priority 1 - Explorer menu = Priority 2 Demote all 2s to 3s and all 1s to 2s... okay, done. - If it gets a "1", it gets onto the High Visibility Main Menu - If it gets a "2", it gets onto the Lower Visibility Explorer Menu - Things that need a 1 and a 2 should be duplicated in the selector area -------------------------------------------------------------------------------- ## Mon May 16 11:16:42 EDT 2016 ### Angling In On An Exciting New Chapter of Pipulate & Levinux Thinking about Levinux and Python3. This will be quite a feather in my cap, updating and re-implementing Pipulate under Python 3 on Levinux.
Keep this in mind: tce-load -wi python3 wget https://bootstrap.pypa.io/get-pip.py mkdir /mnt/sdc1/tce/pip/ ln -s /usr/local/bin/pip/ /mnt/sdc1/tce/pip/ sudo python3 get-pip.py Okay, this got me up to the next error: File "/usr/local/lib/python3.4/distutils/sysconfig.py", line 437, in _init_posix raise DistutilsPlatformError(my_msg) distutils.errors.DistutilsPlatformError: invalid Python installation: unable to open /usr/local/lib/python3.4/config-3.4m/Makefile (No such file or directory) Ugh! Okay, getting closer. Inching in on the solution. Try this before install: sudo ln -s /usr/local/lib/python3.4/ /mnt/sdc1/python3.4 Same error. Okay, I'm going to have to follow up on this, but I'm zeroing in on the correct area. All I have to do is make pip3 install under Python 3 on Tiny Core Linux, and make that pip install location persistent, and then success is assured, and an invigorating new chapter of Pipulate and Levinux begins. -------------------------------------------------------------------------------- ## Mon May 16 09:13:20 EDT 2016 ### Getting Started Monday Morning, tmux Optional on Journal Terminal Not running tmux on my main journal vim session anymore -- either at work or at home, because of... well, a few reasons: - Full-screen OS X mode is really awesome, and the 3-finger swoosh is a better way to switch terminal sessions than tmux. - I don't remap keys very much, either in vim or presumably in tmux, and ctrl+b conflicts with vim's common "back one page" scroll shortcut (a "gentle" conflict that just makes vim require one more ctrl+b) But in the end, I'm stripping away and tearing down the "extras" to the minimum workable environment, and while tmux will probably be part of that stripped-down bare minimum in many cases, it does not extend to my everyday "home" of my daily work journal, which deserves its own simple and dedicated terminal window. Mon May 16 09:56:14 EDT 2016 I had some trouble getting my virtual machine running this morning. Probably a result of my difficulty getting the Kubuntu Plasma 5 desktop to run, and the subsequent attempt to uninstall it when I saw I had file space issues with my 8GB virtual drive. Sheesh! Be careful with that. You can't risk the reliable running of the reports, which I have occurring now. Now, I only have a couple of hours to prepare for the Menu working session meeting. Pare things down and prepare a good session. -------------------------------------------------------------------------------- ## Sat May 14 13:54:15 EDT 2016 ### Techniques - Think out-loud into vim naturally - Think and code in Python naturally It's amazing to realize how much good stuff I/we have once I take the time to excavate it! Wow, it's fun hashing my way through life, thinking out loud into vim. I'm going to leave a better trail and evidence of my existence than merely just hashing my way through life silently. Robots... clones... but probably mostly robots are in my future. I can imagine a dozen or more different types that are going to come within the realm of the possible in my lifetime, and I may just be the one to implement good, viable working versions of them first. Legacy perpetuators and such. Self-perpetuating, spontaneously reinitializing family internets made up of a mesh network of self-repairing robots capable of suspended animation... for starters. Getting close to leaving for the Catskills. Called the garage and told them I'll need the car at 2:30. Good compromise right there.
I get what I want, Adi gets what she wants, and the grandparents get what they want. - Always think ahead - Always be defending - Dispatch the things right in front of you that are currently easy to do, right away before they become difficult - Repeat I will not respond to emotional terrorism. Shit, I've got to protect myself from some very nasty techniques. Be above it. Separate yourself from it, and try to see yourself in context, in that actual objective world we all occupy. -------------------------------------------------------------------------------- ## Sat May 14 12:19:09 EDT 2016 ### Figuring Out How To Teach Adi Important Things About Life Okay, Adi DOES want to go to the Catskills today sooner rather than later. I will only get done the necessities that have been slamming and jamming me up this week: getting the coins off the floor that she spread around my bedroom, putting my clothes away in my bedroom, and doing the dishes. Okay, dishes and coins done. Only my clothes remain. Make 3 Ikea Bags for socks: 1. For all my white & grey tubesocks 2. For all my dress socks for work 3. For all other socks, like Adi's & Rachel's The thing is to not let the desires of Adi (and Rachel still, for that matter) have too much of an impact on my getting done the things I have to get done to be a responsible adult human being. The strategy over the years seems to have been to run away from whatever place enough crap has accumulated in to make it unpleasant to be at, and a stressful chore. We ran to the Catskills on the weekend to get away from a shithole that $180 a week barely brought back from the edge of collapse. Even just the 80/20 rule passes are sometimes something of a chore. Another screaming rampage from Adi over me doing a little organizing of the shit-pile... stood up to her. I can stand up to her much more easily than I could to Rachel. It's not going to end in a divorce-size fight anymore, over little matters of doing the things every responsible adult has to do. I have to become that force-of-nature with my own child that I'm attempting to claw my way back up to at work... first time, probably since Scala, 15+ years ago. Oh, how women being in your life changes things. I feel myself under the stress of "being broken" like a wild stallion. Whenever I would talk about the need to "break" things like dogs to housetrain them, Rachel would get angry, but I see it was the very thing that was being done to me, and now that she's (almost) gone, really only Adi is there to attempt to carry on the tradition, but I'm just simply not putting up with that nonsense anymore. I am allowed to clean, and I am allowed to use the ONLY decent chunk of time that I have to do a little of it -- on a beautiful Saturday afternoon where we COULD run away from it all to the Catskills... which is just another shitty mess. My observation is that anytime I try to do anything to better my life that's not on work-hours or during my sleep hours, there is somebody or something running interference, trying to get me to stop doing that thing. I just wore down Adi, and she's "taking a nap" because I wouldn't keep her full-time entertained. Shit, I can't believe it's come to this. Aside from weekends, there's ~7:00 PM to about 10:00 PM every night, during which time I eat and decompress for the day.
I have to start using that time during the week more wisely, which I haven't done lately, because I've been working late at the office every night, to try to get myself professionally back to where I need to be to keep paying for the... uh... my lifestyle. Ugh, what a shitty situation. I should get more things my way now, and my way is to dig my way out of ten years of crap, starting with my immediate surrounding personal space of the bedroom... continue with clothes... The lesson in life here is to never believe assurances. Only believe your eyes. The observation game is much more important than even I'm making it out to be. The reality in your head is the reality in your head. There is an objective reality that we both share, in which shit is piling up and you're not doing anything about it, and I seem to be the only one suffering from it. And then there is the reality in my head, where I (used to) want what just made everyone happy, but now I'm wising up, and I have to want what makes me happy first. If I can be happy, and other people can be happy as a corollary effect, then all the better for everyone. But other people's happiness can no longer be at the expense of my own... period. -------------------------------------------------------------------------------- ## Sat May 14 11:56:21 EDT 2016 ### Possible Book Name: Chipping Away at Python Hello, and welcome to Chipping Away at Python, where everything is a small and understandable baby-step, where I walk you from not really even understanding what Python is, to what I hope will be something of a high-tech Super Power, 5-minute tutorial by 5-minute tutorial, which I will broadcast over many channels (YouTube screencasts, web pages, podcasts, animated whiteboard drawing sessions and the like) so that learners of many types can follow along, and comfortably jump onto the Python bandwagon. -------------------------------------------------------------------------------- ## Sat May 14 11:42:44 EDT 2016 ### You Have To Live With The Results of Your Decisions This doesn't have to be a difficult day. It can be a casual, relaxing and enjoyable day. No rush, rush, rush. And no OCD approach to cleaning. Just a series of 80/20 rule passes that leave me at a MUCH better place for starting the week again on Monday than I am at now. I am in danger of becoming perpetually fatigued, putting in 110% at work, 110% with Adi on weekends, and then short-changing everything else that's important too. Be careful. Much about what it is to be human is having and showing the proper amount of restraint at the right time. Don't just do everything that pops into your head. In an argument with Adi last night (me not wanting to wrestle because I was tired at the end of a long week), she picked up my Surface Pro and tried to smash it on the floor. It's the meanest thing I ever saw her do, and I'm not going to let her forget it. That's going to be her punishment -- when she wants to wrestle and I say no, I'm going to remind her of the time that she tried to smash something that was special and important to me. That's a pretty big punishment, I think, and one that she will not soon forget. I know how big that can be in a child's mind, so I will also not overdo it. But the point is that she has to learn to not demand everything she wants and then try to punish those around her in mean, spiteful ways without expecting consequences. You have to live with the results of your decisions.
-------------------------------------------------------------------------------- ## Sat May 14 10:21:09 EDT 2016 ### Helping Adi Find Happiness Within If I dedicate today to getting organized... finally, a day in my life to do that, then I have to make sure that: 1. I really do get stuff done 2. That I'm still helping Adi to be happy This "make her happy" and "make her angry" shit has got to stop. She's picked up too much of the idea that other people are the source of her happiness and feelings. She's going to be a very unhappy person in life if that particular incorrect perception persists. Nobody "makes" you happy or "makes" you angry. You are either just happy or angry inherently, and events around you give you reason for letting out the feelings within. It is my job to ensure that her feelings within are generally happy ones, by helping her to discover things about herself, and I guess the human condition in general, insofar as a 5.5-year-old can comprehend them. -------------------------------------------------------------------------------- ## Sat May 14 11:54:19 EDT 2016 ### Cleaning, Catskills and Pipulate Saturday morning, not even 10:00 AM. I can see that now from a full-screen terminal window, because I'm using it through tmux (even though it's not a subdivided screen any longer). I'm going to use tmux to make a left and right border to make my journal just the right size, running down the middle. Yeah, that's the ticket. This is getting really geeky. It's strange to have a terminal session on each side of my journal as a border, but the benefit is running tmux (and the shell) in true full-screen mode, OS X style, which is very cool -- it allows you to focus by stripping away all the OS stuff. Today is one of the few days in 10 years or so where I have been able to go to sleep early enough (Adi actually got tired at a reasonable time) and get as much sleep as I felt I needed (cats waking me up to feed them being the only interruption). And today, if Adi doesn't really want to go to the Catskills, I can clean almost from morning to night. We could always go to the Catskills for a half-day tomorrow, and I can drop Adi off with the grandparents. That would be a good compromise. Give me what I need, and give Adi a chance to see her friends. In the meanwhile, get your coffee and plan out your day with a number of 80/20 rule passes. And think about Alice and Bob -- the perfect framework to re-introduce Pipulate. - The ability to run jobs of fairly large size (20K to 40K rows) fairly quickly. - Think about concurrency when appropriate, but don't predispose the project by necessitating concurrency. - Speed primarily comes from not having to go through the GData API row-by-row, but rather favoring chunky processing. - This will require more persistence on the back-end than I've been willing to give it so far, but not between "runs", so no problem. - The Web Service model will be the primary Pipulate interface, with everything else going through it (or mapped to it / non-http). - Something like a CSV gets submitted. Something like a CSV gets returned. - Avoid any other meta data on the request. Convention rules. - When convention needs to be overridden, it can be same-sheet instructions utilizing the first few rows. - Pipulate needs a way to interrogate a datagrid to see if there are embedded meta instructions in the first few rows. - This follows "directive" rules, much like text files that lead with a shebang. Of course! If a shebang is in A1, then overriding directives exist (sketched just below).
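So, thinking in Python about that shebang convention -- nothing official, just me sketching; the file name and directive format are placeholders:

import csv

# If cell A1 starts with a shebang, treat the first row as overriding
# directives rather than data. What goes inside the directive row is
# still to be decided.
def read_request(path):
    with open(path, newline='', encoding='utf-8') as f:
        rows = list(csv.reader(f))
    directives = None
    if rows and rows[0] and rows[0][0].startswith('#!'):
        directives = rows.pop(0)
    return directives, rows

directives, table = read_request('request.csv')
print(directives)
print(table[:3])

If directives comes back None, we're in pure convention (implied) mode.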
-------------------------------------------------------------------------------- ## Fri May 13 13:24:45 EDT 2016 ### Going to get Python 3 with pip3 running on Levinux - tce-load -wi python3 - wget https://bootstrap.pypa.io/get-pip.py - sudo python3 get-pip.py File "/tmp/tmpsoh8cxbi/pip.zip/pip/_vendor/distlib/util.py", line 401, in write_binary_file with open(path, 'wb') as f: OSError: [Errno 30] Read-only file system: '/usr/local/bin/pip' Okay, near the end of the install, I get the above error. I think this is a blessing in disguise. I'm going to be able to make this location write-able using Tiny Core Linux and QEMU mount-point tricks, and since everything pip installs gets written in there (presumably, this is where pip drops all its install files), I'll be able to make things installed with pip persistent between Levinux sessions without a backup. -------------------------------------------------------------------------------- ## Fri May 13 09:45:47 EDT 2016 ### Upgrading Ubuntu Desktop to KDE Plasma 5 (Kubuntu) Okay, I've heard enough about KDE Plasma 5. Time to take the plunge, especially since it's just another apt-get install over vanilla Ubuntu. Okay, first we add the Kubuntu stable repository to our list of usable repos, per the instructions at: http://ubuntuhandbook.org/index.php/2015/08/install-kde-plasma-plasma-5-4/ sudo add-apt-repository ppa:kubuntu-ci/stable sudo apt-get update sudo apt-get dist-upgrade sudo apt-get install kdm sudo apt-get install plasma-desktop After a restart, I should have the Plasma 5 desktop! If I don't like it, and want to purge it, the commands will be: sudo apt-get install ppa-purge && sudo ppa-purge ppa:kubuntu-ci/stable Oops, the kdm package has no installation candidate. Going right for the desktop, then... okay, I need: - kactivities - plasma-workspace - kinfocenter (recommended) So... sudo apt-get install kactivities sudo apt-get install plasma-workspace sudo apt-get install kinfocenter Doing that WHILE the reports run this morning. Make sure the reports fully complete (including the post-processing title uniqueness update) before rebooting your Linux virtual machine. It may be unrecognizable, but this is what keeps things fun and interesting. I won't just be on Ubuntu, with the greatest repository on Earth, but I'll be on KDE, with (reportedly) the best desktop UI on Earth. So, when I explain what's going on with my machine, it goes well beyond "I'm running Ubuntu". Now, it's "I'm running Kubuntu, which is Ubuntu with the KDE Plasma 5 desktop manager"... pshwew Okay... the reports are half-way run. It sounds like Adam Whippy will be stopping by around 11:00 AM. Get as much of the menu work done as you can before things start again, fast and furious meeting-wise today. Fri May 13 12:05:24 EDT 2016 Adam Whippy just stopped in for a visit. Wow, do I love that guy -- a true SEO peer -- unlike many of the posers in the SEO-space. Still doing Python programming, and taking only a few clients and "going deep" as if a full-timer and in-house. Excellent SEO consulting strategy, but you have to find just the right SEO-clients. He's got such a good approach to things. I don't have a lot of time before meetings start, and I'm really living on the fault line doing this stuff, but rebooting Ubuntu, which SHOULD come up as Kubuntu... but didn't. Ohhhhh...
not just a restart, but a logout and clicking that little circular icon near the login field. And there it goes, but I get the message: All shell packages are missing. This is an installation issue, please contact your distribution. I had to send it a control+alt+delete to get any interactive user interface, and that was only the logout option, which I did. Switching back to Unity to do a few apt-get updates and upgrades. Done, and no help. Google the message... ugh. Okay, there's some ambiguity surrounding: apt-get install plasma-workspace vs. apt-get install kubuntu-desktop The first didn't work so I'm trying the other. The other appears to be installing significantly more than "just plasma". I don't yet understand all the layers and the dependencies here, but I'm pretty sure it's safe to say that satisfying all the dependencies of swapping out the interactive desktop layer of an operating system involves quite a bit of re-wiring of what the parts are and how they fit together, and doing it all cleanly enough so that you can switch back and forth. Now, I get: A display manager is a program that provides graphical login capabilities for the X Window System. Only one display manager can manage a given X Server, but multiple display managers are installed. Please select which display manager should run by default... and a bit about why that isn't true if you edit /etc/init.d. Ugh, okay... I have to choose between lightdm and sddm... I chose SDDM... the opposite of the Ubuntu choice. I already use Ubuntu, so let me try the different thing... Simple Desktop Display Manager... the Wikipedia page says: > KDE chose SDDM to be the successor of the KDE Display Manager for KDE Plasma 5. Wow, was that a big install! This had better work. Ugh! No space left on device. Crap. Uninstall all that crap, and be ready to do another fresh virtual machine later. This is the rabbit. -------------------------------------------------------------------------------- ## Thu May 12 22:44:54 EDT 2016 ### IPython is now Jupyter? Either way, I've got Notebooks working locally. I'm actually now also doing: pip install jupyter This is going to be interesting. I want to run it on my own machine. I'm probably nuts for doing it on the native software installation on the hardware instead of through some virtual machine, like I do at the office. Truth be told, the increased-complexity nonsense is rarely worth it when it just works fine natively with so much less overhead of every type, and when you do need isolation, there's virtualenv. I'll be using that soon enough, but I haven't had anything like the version conflict problems that I've heard about. I think I maybe encountered one, and had to pin a version on a pip install command once to get the right version of a thing. Anyhoo, I'm angling to get IPython, now known as Jupyter Notebooks (IPython is so much easier to remember and type), running in server mode, so I can do all that loading and saving and charting and crazy Python tutorials, take-it-to-the-next-level stuff. Wow, okay, I got Jupyter running locally on my Mac, on localhost 8888... wow. Nice stuff. This may very well be... ahhh. Nah, wait. This could be something I use to help me learn Python, right along with Pythontutor, and PyCharm and any of these other tools that make life all so wonderful and Pythonic.
But really, at the end of the day, you should be using vim and getting along with just a modified .vimrc, without even excessively heavy plugin dependencies, because you're going to be switching around machines a lot. And so... and so, my upcoming talking head videos will probably have a Terminal and vim, and now tmux, feel to them. IPython is nice, but text-editor muscle memory is nicer. -------------------------------------------------------------------------------- ## Thu May 12 22:19:29 EDT 2016 ### Alice and Bob Intro I feel myself gradually easing into a new phase of all this. I feel the need to copy-paste these phone-notes about the new book before I forget: Alice & Bob Login There's two parties who need to do some work. One is named Alice, and the other is named Bob. On occasion, we will call them A or B for short. A and B work together to produce C. B provides input to A, who plugs a return value into C. This can be expressed thus: C = A(B) B is the customer. B has a business need and is coming to A with a question. B works for "the man" and has to figure out how to make more money by selling less product. The answer, Bob believes, is to sell more to your best customers, and to get rid of the worst, some of whom cost you money to keep as customers. Bob can calculate this for each customer, given the cost of having them as a customer (input parameter A) minus the amount of money you made from them in the same time period (input parameter B). In this relationship, Alice is the Admin. She is superuser. She is root. - Bob is the Business User - Alice only types in a terminal - Bob only browses in a browser - Alice and Bob work well together to get stuff done. - They both work through Google services. - Bob only knows spreadsheets, and really doesn't want to learn anything more. It's not an aspiration of his to become more technical or learn how to program. Still, he is comfortable in Excel, and knows his way around a formula. But spreadsheets can only take you so far. - Say you want to look up the title tag for every URL in a pretty long list. Let's say 2000. - It's easy enough for Bob to enter 2000 rows into Google Sheets through copy/paste or import - But now he has to share the sheet with Alice, his tech, and explain to her exactly what it is he wants to look up for each row and how - He could accompany the sheet with a note explaining it, trying to put it in the context of the provided sheet. Or, as a much better alternative, the request could be made by agreeing upon a few conventional standards to be adhered to when submitting the request sheet. - Request sheets can be thought of as CSV files, as they most often likely will be. - Nothing further than the CSV need be provided to define the job request. Just one table within a text file. - So, how do we define a request with a table? Easy, there are two methods: the first is explicit and the second is implied. Explicit vs Implied Mode Many of us 9 to 5 office drone workers are used to thinking of functions as this invisible code somehow embedded into spreadsheet formulas of xlsx files. You lose those formulas when you export the sheet as that universally interchangeable data format simpler and far older than either JSON or XML. So simple CSV text files, yet so opinionated, nuanced and potentially powerful, they are. Make the files utf-8 encoded, and you can handle a lot of foreign characters and character encodings. And lots of data, definitely very grid-like and tabular in nature.
But NOT a lot of nested data relationships here -- unless it's field-stuffed, with all the escaping and encoding and neutralization nonsense that accompanies such attempts. I intend to take advantage of the universality of CSV files, coupled with how well both Python and JavaScript can support CSV file data interchange as a sort of subset of JSON or the Python list, dict and tuple data structures. In plain English, I'll be tapping into some unique strengths of both JavaScript and Python to zap CSV files around that serve as both unprocessed job-requests and completed job-replies. Neat, huh? I'm pretty pumped about this, and am already thinking a generation or two beyond this, for once the foundational core programming of an algorithm playback reference specification is in place. Ladies and gentlemen, I present to you, the loom. Or maybe the player piano... of easily written-by-non-techs job-request music... as CSV files. Or as Google Sheets. Or who knows how else. Point is, there's an intermediary mapping layer that's simple and tabular and stacked-list-like in nature. And they run, and transform a copy of themselves. Less simply put, a Turmite is about to infect your mental coding. Here is your escape sequence. Listen carefully: Clear your mind right now. Forget you ever saw or heard of me. Hit RESET! RESET! (if you don't want to get caught up in a recursive language hack code meme that I'm hoping maybe would make Dawkins and Stephenson proud). This CSV mentality is superimposed on top of similar Python data structures and persistence mechanisms that bind data to gigantic Python dicts data-bound to back-end databases. What I propose here with Pipulate is way simpler than that, and can be done by any general programming language runner. I use Python and a nearly memory-less back-end that takes data in, interprets and carries out the request, and returns the results in either Pythonically generated streaming, chunky installments -- or as just one big reply-chunk. The choice will be yours, based on what your interface layer consists of. Pipulate will provide choices of ways to interact with it. For some, it will be drag-copying CSV files into a folder, and double-clicking a pipulate.desktop (or some such equally cheesy Platypus-ish bundle, .bat or other double-click way of running a script) to turn the submitted request into a completed job, copied to a parallel directory structure if the job was submitted by command-line. If, however, the job request was submitted as a MIME-attached CSV file to a Pipulate server, Pipulate's action will be to stream the reply back, possibly in paginated chunks. In this way, Pipulate is itself an API, and its own working reference specification to the standard, like Netscape once was to the HTML standard. Pipulate job requests can be formulated as the POST-method equivalent of the following: http://pipulate.mikelevinseo.com/apiendpoint?[('param1', 'param2', 'funcname'), ('input1A', 'input1B', '?'), ('input2A', 'input2B', '?')] The result will come back as the same shape, with the question-marks replaced by the return value of the server-side execution of funcname('input1A','input1B') and funcname('input2A','input2B') to produce: [('param1', 'param2', 'funcname'), ('input1A', 'input1B', 'spam'), ('input2A', 'input2B', 'eggs')] ...presuming the return values of spam and eggs from the respective function calls. I will define the pipulation process here, and others can build implementations better optimized than my working reference spec.
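And just to sketch that request/response shape server-side in plain Python -- the FUNCTIONS registry and the toy concatenating lambda are stand-ins, not real pipulations:

# The first tuple names the function; every following tuple carries inputs
# plus a '?' that gets replaced by the return value.
FUNCTIONS = {'funcname': lambda a, b: a + b}

def pipulate_request(payload):
    (param1, param2, funcname), *rows = payload
    func = FUNCTIONS[funcname]
    answered = [(a, b, func(a, b)) for a, b, _ in rows]
    return [(param1, param2, funcname)] + answered

request = [('param1', 'param2', 'funcname'),
           ('input1A', 'input1B', '?'),
           ('input2A', 'input2B', '?')]
print(pipulate_request(request))

Same shape in, same shape out, question marks filled -- which is the whole contract.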
I have found that my original vision for Pipulate was just beyond my reach, when last I tackled this project. The project will remain extremely lightweight in nature, with its target execution platform being 512MB of working RAM for running programs, and maybe another gig or two for the OS, Python, packages, and extremely temporary local data caching. Pipulate does not provide the storage layer. It takes requests and returns results like a headless webserver. You will not find Pandas or even numpy being imported here. You will find Flask though, for its web-service mode. I'm sure I'll support a bunch of optional JavaScript libraries to visualize the results without Google Sheets always in the picture. Maybe, submit a job through a web-interface CSV file upload, and get back a beautifully rendered visualization using d3js or some such. But no heavy components among the default requirements. What you do with your server build configuration in the way of pre-installed packages on the Pipulate player nodes is up to you.

--------------------------------------------------------------------------------

## Thu May 12 22:14:18 EDT 2016

### I did my first IPython install

I just installed IPython on my home Mac just now, with:

    pip install --upgrade pip
    pip install IPython

I didn't run it in server mode, but I can absolutely get to IPython now from the OS X terminal, simply by typing ipython. I wonder if that gives you notebooks and everything, or if that loading and saving interface with checkpoints is all part of the server way of running it only. So much to learn, and continually hard to believe the sophistication-levels of these tools. This is all well, well beyond Amiga days... there you go! I worked Amiga in again.

--------------------------------------------------------------------------------

## Thu May 12 20:53:39 EDT 2016

### A few notes about tmux and other abacusantikytheramagigamacallits

Well, this is an interesting experience. This is my first time using tmux on my home iMac 24-inch screen, with the tmux window sub-dividing (into panes) feature splitting that 24-inch screen into an incredibly more sane size than the comically big, full-screen 80 columns I was using. I had to turn my head left-to-right like watching a train pass by as I typed, to read my own words. But now, it's comfy AND still in that nifty new OS X full-screen mode that I love so much. Sorry everybody, but Macs really are just the best Unix-like host system... so much so that they actually ARE genuine Unix, when compared to any version of Linux ever. Only BSD can stand up to the credibility of Apple in this ultimately geeky CompSci-history debate. Unix is an ivory tower of GNU'sers, such as it were, and Linux is not a member of that hotsy totsy club, consisting of Ken, RMS, and precious few others who are still around. SCO lost, and BSD won... a few years after Linus and the FOSS community won with Linux first. In one of those ultimate monolithic-kernel-versus-the-HURD bits of humor that only dinosaurs can appreciate, Linus refuses the GNU GPLv3 over Tivoization. Linux is not GNU3 is not Unix... is not Multics... is not whatever else was in some damn cobbled-together system that Multics was imagined from, and so on back to ENIAC, Charles Babbage's Difference Engine, and probably some Chinese or Greek abacusantikytheramagig. Who knows, maybe Raptoroids were pouring water through eggshells to calculate how to disembowel their prey more efficiently, and back to tmux.
Gotta get down these commands:

- Ctrl-b c for a new window
- After Ctrl-b, [0-9] jumps to another window by index. Neat!
- Always make sure "set -g mouse" is in ~/.tmux.conf in order to stay sane.
- Mice or trackpads or touchscreens or whatever are now a part of life.
- To try to pretend there shouldn't be an easier way to switch and resize panes than to move a mouse, grab a border, and slide, is to be silly.
- Finger convolutions for common sequences are unkind to the user. In this way, vim is still a superior editor to emacs, even though from a completely MIT-objective standpoint, emacs is superior. You could write vim in emacs, emacs is so powerful, and in fact, they did. It's called evil mode.
- I need to get better at creating sessions, switching between them, detaching them, coming back to them later, and the such. I don't know how I lived without this for so many years. Go figure.

Lesson learned: good lessons are hard-learned, sometimes over many-a-year, and many-a-didn't-get-the-point mistake. So back off, will ya! Get your own blog and your own channel. Oh, you did and no one watches or cares? How sad for you. And due to projection, that can of course be deduced to be what I fear most in myself... that I am some sort of clown. What kind of clown do you think I am? Does it AMUSE you that some like to type? I know you don't repeat yourself. But we enjoy typing. You're DRY and I'm WET. But can't we all just get along? I repeat myself, can't we all just get along? And those are a few things to remember about tmux.

--------------------------------------------------------------------------------

## Thu May 12 17:33:00 EDT 2016

### My Plan to Leverage the Private Repos

I've been heads-down working on 2 projects recently, which have both kept me from more exploration, but also have been churning up the desire and need to do more exploration. I am going to go home at a sane time tonight, and think about these projects in a series. The timing is interesting with unlimited private GitHub repositories now possible under the 1st-tier paid level of service. Hmmmm. Several projects secret-side, which I mimic public-side, doing it the way I would have done better the second time, but coming off like the first time. All issues fresh on my mind... yes.

--------------------------------------------------------------------------------

## Thu May 12 08:42:15 EDT 2016

### New Workflow Starts to Take Form

At work, running the reports. Feels pretty friggin good. Don't forget to run unique.py against the result. I have it using the same cfg.py for paths, so that's good -- a step towards true integration. It can't be a follow-on post-processing step in the real automation process. Now that I'm settling into tmux, Ubuntu Unity has even less to offer, aside from the Debian repo system. I mean, I like Ubuntu Unity and all, but I'm no longer thinking that I LOVE it. And there's a lot of talk about KDE Plasma lately, and how totally refined and well thought out and beautiful it is. Starting to do my research into my next Linux virtual machine. Oh, and I just added this line to a new ~/.tmux.conf file (the beginning of a long relationship?):

    set -g mouse

...and now, not only can I select the tmux window panes with my mouse, but I can also resize them by clicking-and-dragging on their borders... wow! Tmux, where have you been all my life? Oh yeah, mostly there, with me ignoring you.
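For the record, a minimal ~/.tmux.conf along these lines is all I'm talking about -- the mouse line is the one from above; the other two are common conveniences I'm assuming from the man page rather than settings I've lived with yet:

    # ~/.tmux.conf -- a minimal sketch
    set -g mouse on               # click panes to select, drag borders to resize
    set -g history-limit 10000    # keep more scrollback
    set -g base-index 1           # number windows starting at 1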
Well, thank-you whoever chimed in and said that it should be one of the honorary members of my extremely abbreviated short stack of Linux, Python, vim and git. Since I started sharing my tmux experience on the social medias, one of my favorite people, Jeff Scott Ward, chimed in with a tool showing how tmux can be turned into a web collaboration tool through browsers, using https://tmate.io/

Now, onto the menus. Do a few Hail Mary passes to flesh out the menus through the pivot table trick. Do what you know is going to make good-looking menus, instead of just allowing it to circumstantially "emerge", which is resulting in some pretty awful hierarchical structures. TALK THROUGH the menus and the strategies... and particularly, the filters that help you do your work.

--------------------------------------------------------------------------------

## Wed May 11 22:00:05 EDT 2016

### Unlimited Github Private Repos in Github's $7/mo Tier-1

Github was really smart to do this first. Woo-hoo, Github! I'm glad to be typing into a text file that I will commit and push to a repository that resides on you, and is published to mikelevinseo.com through the github.io publishing system. I am so comfortable with this way of working now, I would absolutely call it my preferred form of blogging. And why not shove it all onto one apex homepage URL, on which the entire website resides, in this day and age? What use are the sub-pages, but for more precise, topically-divided units, upon which it is easier to attach hypertext anchors... oh yeah... that's called the Web. Oh well, another thing I didn't invent. So, this one long page experiment is nice, but don't stop chopping up your site into the correct handle-able chunks... that is, chunks upon which you're going to slap a path and filename as part of the handle, called the URL. All around, good system.

This book is for people who would like to become a whole lot more technical, perchance to do more incredible things in their lives, by virtue of having the knowing and the doing of things. Yeah, that's a good plan. Where do we begin?

<pre>
Tools, tools, everything's a tool
Letting you shape anything so that it's cool
Like this, or like that, 'cause you are no fool
</pre>

Perhaps carpentry is the closest analogy. Or people like to draw the parallels to Kung Fu or Samurai. We keep our blades whetted and ready. Some of those blades are nano-precise, through an atomically precise fabrication technique known as folding, such as used with samurai swords. And in such ways, miraculous properties of matter are available to us through relatively simple formulas and matter-hacking techniques. Using semi-conductivity to make those tiny little solid-state switches we call transistors is one of those matter-hacks. That particular matter hack is the pivotal one that enables the digital age. Not all electronics is digital, and not everything digital is electronic. Uh yeah, something like that, but edited for a little more mainstream palatability, might be an intro to my book. But what does it SAY? Not exactly an executive summary.

### BOOKS O' MIKE LEV.IN/UX

- Book One: A Matter of Hacked
- Book Two: Tardigrade Circus
- Book Three: Metcalfe's Merkle Tree

--------------------------------------------------------------------------------

## Wed May 11 12:01:11 EDT 2016

### Menus

Okay, now... FOCUS!!! Currently listening to Podcast.__init__ ...and if that isn't a Python inside-joke podcast name, I don't know what is!
One of the few internals that's common practice to address directly, I believe. Gotta investigate that too. I don't make as much use of __init__ as I should. Now, is that really focus?

Wed May 11 12:40:06 EDT 2016

Listening about Python SaltStack... a configuration management system... but really more of a generic management system. There are many layers of Salt that are pluggable. So, what most people see as the primary salt commands are the execution management system. But there's also a state layer, and they are two systems that work together. Cool large-scale deployment control systems. Python Salt is one of the viable magic wands in Sorcerer's Apprentice scenarios. There's so many subsystems, like reactors and events and such. Kinda profound... that's confusing, and there's a lot of bits I didn't know about... hahaha! This is way beyond simple boto scripts. This is about deep, long-term health of systems. Look at the beacons system too. Focus? Really, is that focus? Oh, but it's all so related. I will benefit from this distraction soon enough. Chef and Puppet... look at those vs Salt. Canary deployments with 10 lines of code... or within minutes. Surprising simplicity.

Wed May 11 14:40:56 EDT 2016

Ugh! The "Combined" tab appears to have gotten mangled. This is a potential setback.

Wed May 11 18:27:01 EDT 2016

Great talk with Rupali. Heading home.

--------------------------------------------------------------------------------

## Wed May 11 08:16:54 EDT 2016

### Final Sprint on Two Projects (and tmux in use)

I'm back in the journal again... I'm back in the journal again... This is quickly becoming as real of a space as anywhere else I live. I got here quite early this morning for the menu project. But got the Pulse script running. I like the concept of a pulse check versus a health check. It's just taking the pulse of the sites... but in an incredibly useful and directionally meaningful way... a way that helps you formulate concrete specific actions around it. But there are nuances to work out in the process -- many nuances. I have solved problems separately, but now I have to merge them together into a single solution. The latest of which is ensuring unique, individual title tags. I also have some file renaming and copying to do, so as to not have to do it manually (past today). But this report automation stuff is not why I'm here early. But I did want to get my solution running that I coded from home this morning, regarding finding the last day that Search Console actually has data for, before running the Keywords Day column... done! Seems to be working. Now, onto menus! Hurry, hurry, hurry!

Wed May 11 09:57:20 EDT 2016

Okay, the menu selection process really is a delight -- nothing to dread at all. I am effectively shaping the future of the site. But before driving that ultimately home, I need to do the final processing of the daily scripts. Make it a bit cleaner than last time.

Wed May 11 10:23:38 EDT 2016

My first day with tmux... wow. Living right at the edge of muscle memory.

Wed May 11 11:55:42 EDT 2016

Okay, a productive morning in tmux, and with the whole report.py process. Time to focus laser-like on Menus until done.

--------------------------------------------------------------------------------

## Tue May 10 22:38:08 EDT 2016

### Figuring out how to get the last collected day of Search Console data

Just did this test...
    alist = [0] * 10
    alist += [1]

    def recurme(alist):
        print(alist.pop(0))
        if alist:
            return recurme(alist)
        else:
            return 'gotcha'

    print(recurme(alist))

...which is quite hopeful, for all those times I want to use recursion to get the right number, but I'm thinking in the case I'm dealing with, it may be best to just iterate from 2 to 5 to get the last available day of collected Search Console data. Problem is, it can come up empty up to 4 days in the past. So, basically just doing this is better than the complexity of recursion in this case:

    >>> for i in range(2, 6):
    ...     print(i)

--------------------------------------------------------------------------------

## Tue May 10 21:57:53 EDT 2016

### tmux is a little like my old Amiga keyboard hardware macro device

Okay, learn tmux. First, this... https://www.youtube.com/watch?v=BHhA_ZKjyxo

Absorb the fact that tmux is a service and not a command. When you run tmux, you are running the tmux server. If you create a session in the server, then exit the tmux... uh... "interface" you're in, the session continues. If you log back into that server later, say through ssh from home, then you can see that session still running, and re-attach to it, just like you got up and walked away from the machine for a moment, and then came right back. Difference being, you're actually in two entirely different locations. tmux was the bridge. It is like a place. And indeed, it can be used for terminal sharing (ye ol' screen sharing) for collaboration, pair programming and such. Wow. And Ctrl+b... and some other command, like "c" for create a new window, is how tmux works. Weird... and wonderful... and like hardware macros from the Amiga days. tmux is providing you with a sort of hardware macro that's accessed by the input device capturing the Ctrl+b signal, and then re-routing I/O to the hardware macro device... to say, record a macro for later playback, triggered after a corresponding Ctrl+b signal, but coupled with hardware macro playback instructions, rather than recording. Yeah, tmux is a lot like my old Amiga hardware macro device.

--------------------------------------------------------------------------------

## Tue May 10 21:37:24 EDT 2016

### Linux, Python, vim and git... and tmux. Yep, and tmux.

Time to learn tmux. Gone long enough without it. I feel that missing hole in my super-powers. Many signs lead to tmux. Many signs also lead to emacs, but that'll be a mountain to climb another day. For today, my mission is to start getting tmux under my belt. It's a crime that I haven't so far. I forget who it was, but somewhere in my YouTubing past, some lady or gentleman suggested to me that I add tmux to my very short stack of development and execution environment tools. How little do you really need to be in business, sitting down somewhere and getting ready to get to work? This is the question I tackled when settling on my personal stack as Linux, Python, vim and git. This simple set of four things still leaves so much unsaid, and I'm tempted to fill in all the details, such as versions of Linux. I prefer stripped-down, no-desktop, hardened and easily rebuilt primal nodes of computer power, most often thought of as embedded. Why? I don't know... I think because of something having to do with my original experiences with the Amiga Computer, which I somehow manage to mention in almost every blog post and YouTube video I make. It's true. The idea of one person being able to pick it apart and understand every bit, in a "how do we rebuild one of these things?" sense.
And once built, how cool can it be, and for how cheap? Both in a monetary and material-resources sense, and also in the amount of power it consumes when both active and passive. The thought is that we are almost beyond the von Neumann and Harvard architectures, into whatever that IBM SyNAPSE thing is that's a neural net in hardware, so that there is no artificial bottleneck from software having to simulate a neural net on strangely disassociated components. Where was I? Oh yeah... Linux, Python, vim and git... and tmux. Yep, and tmux.

--------------------------------------------------------------------------------

## Tue May 10 21:26:43 EDT 2016

### Ode To Mac OS X Full-Screen Apps and Virtual Desktop Swooshing

It is truly odd to have multiple full-screen Unix shells under OS X on a Mac. The 3-finger swoosh on either a trackpad or Magic Mouse, or the option+left/right arrow keys, to move left or right in the ribbon of virtual screens. To have ever thought such a sublime user interface would be gradually coming into existence on a Mac would have been silly. Surely this sort of ACTUAL, full-screen-biased, and indeed quite Amiga Computer-like behavior, could not come from Apple. Yet, here it is asserting its dominance over Windows 10, which I use regularly, and Ubuntu, which I once again am using regularly too. In the battle of OS supremacy, from its Unix-like foundations to its clever user interactions with virtual screens and full-screen applications, to its stunningly smooth hosting of other OSes using virtual machines like VMWare Fusion, to sessions that have you doing very unixy-type stuff, like pulling down some important app from a free and open source repository, to running an app in full-screen, which happens to be a PC virtualizer product and... well, you have a little bit of a semblance of my day. I wish I was on a Mac for my day-to-day... but as it is, I am on Windows 7. I keep a personal extra Macbook Air on hand at all times, and you'd be amazed how much I just end up using it instead - like for my journal entries there, because it's just nicer... right down to everything about the keyboard, except for the key names and functions, which are still a stupid idiosyncrasy of the Mac. As nice as Macs are, they aren't perfect /Users/.

--------------------------------------------------------------------------------

## Tue May 10 10:27:25 EDT 2016

### Watching Scripts Run

Okay, babysit ONLY ONE PROPERTY all the way through during a report.py run, and then move on! You have zero margin for error now with this menu work.

Tue May 10 12:41:29 EDT 2016

It took me much longer than anticipated to get a fully good run. It is already coming up on 2:00 PM, and I did have to baby-sit the processing to progressively eliminate this issue or that, until I got a full good run for refresh. Still need to do some file-renaming and moving into place on Dropbox (not to mention archiving the old stuff) before I can consider this a cron-ready job. There may also be some error checking to make sure all the files that are expected to be generated actually got generated, so that a re-run is given a chance to occur. But that is all for a later time.

Tue May 10 12:57:31 EDT 2016

Ugh, the duplicate title tag issue. Forgot about that.

--------------------------------------------------------------------------------

## Tue May 10 09:27:55 EDT 2016

### Thinking Through Pipulate Next Steps, Given New Work-flow Learnings

I love it when I remember to commit at home and work.
Each day is a work of art in its own right, and also part of a larger artwork as a whole. Last night before I left, I combined two pieces of art into one (arguably, three) by putting the title-fetching and local-caching work in as a regular part of report.py, which by this time also has end-of-process CSV-sorting as part of its normal process as well -- although that relies on such a heavyweight pandas (and numpy) component for such a menial task. I have some architectural thinking to do. This project was the perfect one to resurface all the issues swirling around near the birth of the Pipulate project, when it was still called GroPY, as a way of groping around for the data you want. I'm much happier with Pipulate for self-evident reasons, and just as with the evolution of the name, so too goes the evolution of the project criteria. What itch precisely is it that I'm scratching with Pipulate, and with what Pipulate is imminently going to become? Easy! It's running large-ish jobs of arbitrary-ish complexity. It can't be enormous jobs of enormous complexity, as there are many other tools that scratch that itch (usually at a steep monetary or learning-curve price). My itch is going to be for people of introductory-to-moderate technical capacity, willing to follow a few instructions to learn conventions. To use Pipulate, they will have to do nothing more than work through a web browser to either a self-hosted (Levinux or otherwise) virtual server on localhost:8888, or a native install on localhost, or a hostname.local zeroconf Avahi multicast-DNS style address on a server on your LAN (Raspberry Pi, etc.), OR hosted on a cloud or other remote server with a registered domain. Any of these need to work, but in all cases, the servers are to be deemed disposable. I can't make Pandas a dependency of the new system. But also, I can't make it so reliant on Google Docs, even just for dumping off the row-by-row data. There needs to be temporary local persistence. This re-working of Pipulate has to be completely and totally secondary to getting your work done at your day-job. The refined Pipulate simply has to "emerge" out of your work, but so clearly of your own design that ZD is happy to just be associated with the already-free-and-open-source project as it evolves. That settles it -- NO new repo for a Pipulate 2 or 3. It's just going to be part of the current Pipulate repo, and will put the components (potentially the same, potentially different from the original Pipulate files) together differently.

Checking on my report.py job from last night before I left: it both finished properly, and left a 2.5MB file on my hard drive from the cached titles. Woot! Double-check that the versions of the csv files with the titles filled in really have the titles filled in. Yup, worked like a charm. Wow, oh wow! It's interesting to have to remember to do a git pull in the morning when I get to work. The last thing that I really urgently need to do now is putting the new files, properly named, into place. Oh, and check your calendar. I have meetings really kicking in around 2:00 PM today. Get a heck of a lot done before 2, because once those meetings begin, you're dead for the day. So urgently above anything else, even getting coffee, get those post-processed files into place. Do the first round manually. Okay, manually put in place. Let Marat know, and then alter your script to put everything in the correct locations. Be fine with redundancy. Okay, get done getting new files to Marat. Already 10:10 AM.
Get your coffee now, and focus single-mindedly now on the menus!

--------------------------------------------------------------------------------

## Mon May 9 22:19:24 EDT 2016

### And Now a Word About Relaxing

Watched TWO Person of Interest episodes tonight. Only stayed a little late at work. Still not as far along as I had hoped on all fronts. Yet still, I watched two nearly-hour TV programs on Netflix. I am so enjoying this new full-screen Mac OS X experience. Keep thinking of the Amiga. Speaking of amigas, wow, that Adi. She is really shaping up into quite a human being. The weekend a week ago, Adi tried to do one of those emotional terrorism moves, where she says the only way for her to be happy is if I let her do such-and-such. Now normally, Daddy is quite a push-over. But these are new days for me and her and us on quite a number of fronts. Homey don't play that emotional terrorism shit. I basically set her straight. Told her I knew what she was doing, and I wasn't going to fall for that shit, and she was going to be doing a lot of needless fake crying that wasn't going to get her anywhere. There was one of those really tense, stubborn eye-to-eye stand-offs, like she was going to force me to change my mind by stubborn force of will alone -- which is basically when I crack a big smile. Wow, people who are trying to make something big and important out of nonsense hate that moment when they realize I'm not taking their crap at all seriously. I can dish it out as well as the next person. Just because I don't do it all the time doesn't mean I don't know the game. Oh, I know the game. Anyhoo... personal journal... yeah, right. Oh, King Sammy. And Coy Boy Billy. I would be remiss if I didn't capture a bit of how Adi is indeed growing up as a sort of Jane Goodall with the cats as her chimps. Our cats, particularly the two I raised from kittens, have incredibly developed personalities. One is a self-important alpha-male covering for insecurities, who we call King Sammy. The other has an "S" mark on his back, and we call him Billy. He might also go by Teddy Cat, or the Coy Boy whose favorite game is of course go-get-billy. Billy is something of an unknown legend, being the first-ever Billicat -- by definition, the first ever, because... well... perhaps that's a story for another time. Suffice to say, Wikipedia will confirm it. I have to make a little positive forward progress every single day.

--------------------------------------------------------------------------------

## Mon May 9 12:45:09 EDT 2016

### Integrating Title Tag Fetching into URL Report

Just added a new macro to expand tabs and trim trailing spaces -- the most common clean-up I have to do to files. I put it in the .vimrc in my vim repo in github (and on Dropbox). Make sure Ubuntu is set to use the .vimrc from Dropbox. Okay, done. Ran the new macro against report.py, and it appears to do exactly what I had hoped:

- Expands all existing tabs (that got in through example copy & paste) into spaces
- Gets rid of all trailing whitespace

Okay, that's enough polishing. Now time for the still-missing pieces. Properly incorporate... oh wait! Change the order in which things run, so that complete properties finish. Change this to be property-centric! ... AFTER you get some more caffeine. Okay, now think through running each property in turn.

Mon May 9 17:12:17 EDT 2016

Got it running through properties before report-types, but it appears that the data structure being returned by the Search Console API has changed.
Going to look carefully and fix it. 1, 2, 3... 1? Ugh, it was all a hiccup in the API. I have to defend against that. When the apparent "shape" of the response changes, beware! The API has not changed. You just have bad timing. But at least I have it now so that an entire property can be processed -- keywords, commerce, topurls and sessions -- instead of having to wait for all keywords, all commerce, etc. It just feels like a much longer wait now on the keywords tab, in a place where it shouldn't. Sigh, well I can optimize it later. The important thing is to... ugh! get in the title-tag work... your original work for today which you never really got to, because of what I thought was debugging... oh, and re-ordering the process... oh, and putting new ASCII art in for the properties. Hmmmm. I am so friggin distractable. Let one property process beginning-to-end, and then put in the title tags.

Mon May 9 18:16:56 EDT 2016

Okay, single properties are being generated well. I could do a little file-writing IO bottleneck optimization in the keywords area, but don't worry about that for now. Just get titles into the urls tables. Okay, got it. Now, go home.

Mon May 9 18:53:24 EDT 2016

--------------------------------------------------------------------------------

## Mon May 9 10:52:17 EDT 2016

### Natural Workflow Here For Book

I feel pretty good feeling like working here is the same exact thing as working on my book. The book that I want/need to write will (should, with proper personal discipline) just take form here as I do my everyday thing. Woot! That's strangely relieving, knowing that the book isn't any particular separate effort. Just keep documenting things here, moving the noteworthy stuff into some flat (or nested) list at the bottom of the document -- the book outline. I've gone that route before... it just hasn't stuck (giant undo button?). Well, I can undo the undo (thank-you, git!). Github even lets you edit project histories (haha), so maybe I could even put the personal journal repo itself public again, after some sanitizing... maybe... haha, or maybe just don't give a shit, and let it all go public again eventually anyway, when none of that shit matters anymore, and we're both past it. There's still a lot of mutual respect there, which probably doesn't get portrayed in this journal enough (and it probably should be), but journals like this are to hash your way through problems, and not to say how wonderful everything is. Though, things are getting pretty damn wonderful, so long as I can push myself pretty hard each day to get where I need to be.

--------------------------------------------------------------------------------

## Mon May 9 09:44:15 EDT 2016

### A Two Video Morning & VirtualBox Keyboard Capture (again)

Pushing up 2 YouTube videos this morning, and I think I finally have the "framework" for writing my book: it shall be "extruded" off the bottom of this file. If the top of the file is reverse-chronological blogging, per... well, all blogging software (but in a single text file), then the bottom of the file is the self-organizing, easy-to-jump-to "tail" of the file. I shall organize my tail, while forging ahead and machete'ing my way through my thoughts and my days... in my head and at the head of this file. Yes!

Mon May 9 10:37:11 EDT 2016

Just finished writing my Monday Morning Weekly SEO Report for my boss. I'm even enjoying these! It's like a mini one-page-plan every week, and I like that; it's a clarity-of-thought exercise.
Keyboard capture collision avoidance between Ubuntu on VirtualBox and the Windows host isn't quite as solved as I was so excited about it being last week. The problem remains getting out of Ubuntu fullscreen, but without turning Ubuntu fullscreen off, which is very slow and disruptive to the process. Instead, I need a nearly 100% reliable way of "getting out of Ubuntu" and over to another virtual desktop provided under Windows 7 by VirtuaWin. The best I can find so far is:

- Ctrl+Home - the Host (Windows) key
- Shift+Alt+right-arrow (set through VirtuaWin Setup)

--------------------------------------------------------------------------------

## Mon May 9 07:41:14 EDT 2016

### Is it Better to be a Tech Billionaire or a Happy Nobody?

I have to make BETTER forward-progress every day than I normally have been. I haven't been without progress, but it hasn't been ENOUGH progress. My boss pushing me hard recently has been a good lesson to myself about how much I CAN do in how little time, if I push myself. The time is upon me when I have to push myself FOR myself. This is not an unreasonable request of myself. If I don't, I could still lose everything. There is no situation so bad that it couldn't become worse, if you don't stay on your guard, and keep pushing, pushing, pushing! Really funny, I guess my intense recent work experience couples with the re-reading of Lord Foul's Bane, and Adi getting interested in such never-give-up attitudes as expressed by Blazion in Yo Kai Watch... hahaha! Well, it's just like how you don't start noticing a particular model of car on the street until you're interested in it, then you see it everywhere. The human brain adapts to be comfortable and spend the least-possible energy to survive in any given situation. Only through competition and generational cycles does forward progress get made. That, plus the occasional benevolent sociopath. I'm just glad Elon Musk hasn't decided to actually become a James Bond super criminal. You change the world through changing the world -- directly, through force of will. It's a crazy thing how big a deal inner discipline can be, but I'm not totally sure what I think of it. Is it better to be a billionaire empire builder who would never live a normal life again, or a happy nobody?

--------------------------------------------------------------------------------

## Sun May 8 11:19:50 EDT 2016

### Organizing Bedroom

Mother's Day. Going to be sure to bring Adi back to Staten Island early enough, and with enough in hand to celebrate Mother's Day with Rachel. Just not going to move very fast during the day. Going slow is one of the things I miss most about my old life, and that I have in great part just regained. I don't have to feel the pressure to move onto the next thing and the next and the next, without having given proper follow-up and attention to the first thing. So, today I putter and organize. Yesterday, I basically just watched Yo Kai Watch and Gravity Falls with Adi all day... bliss! But today, I need to gain a little ground... get my head above the surface... get out from behind the eight-ball, and all that. 1, 2, 3... 1? Clothes! It all starts at your closet and wardrobe. If you're not organized there, you're not organized anywhere. Do an 80/20 pass on your clothes. I have to now totally eliminate the crammed-up, jammed-up feeling. I've got space to work with now, and I can't let myself continue to try to live in a 2-inch NYC personal-space bubble, even at home, anymore. Now, I can get a desk that I can lay out and organize some paperwork on.
But even that's too much of an artificial dependency to put in the path of becoming productive and organized again. I will cobble together enough of a desk and workspace and area. Hmmm. In the master bedroom? How does this play out, exactly?

Sun May 8 23:19:08 EDT 2016

Ugh, had a near-big mishap with losing my ATM card (again) and being completely and totally otherwise without money, and no gas in the car. It was a crisis on how to get Adi back to Staten Island, until I realized I could transfer money to the just-activated ATM card tied to the other account. Realized far too late, and made Adi late for Mother's Day. I think I came out of it unscathed and un-hated. But it was one of the more stressful days in memorable history. I switch jobs with less stress than situations like that. It's late enough. Just go to sleep!

--------------------------------------------------------------------------------

## Sat May 7 18:36:31 EDT 2016

### Coding Guy

I have to assert with Adi that when I sit down with a laptop like this at home, it is not work, and it is not something she should exert energy trying to keep me from doing from time to time. I tell her it's my personal journals, just like in Gravity Falls. I think she only half-believes me. I think she might be convinced this is what coding looks like... hahaha! Well, I am at least using vim, which should give her a little feeling for what it looks like.

--------------------------------------------------------------------------------

## Fri May 6 13:30:24 EDT 2016

### Another Fine Week

Just ate lunch on the 11th floor. Very good to do when I can afford it, for happy accidents and serendipity. Speaking of serendipity, I ran into Dave Berkowitz on the street. Gonna look to see if there's something for him in the organization here. He seems like a potential fit. Tonight will be a dinner at IchiUmi with Adi and Rachel for Adi's five-and-a-half-year birthday. Much lower key than in the past. Seems right as she's getting older. I think I made the right choice last night and today, regarding focusing on the report clean-up and automation. I actually spent a lot of my creative energies, and I can feel the fatigue. I have to keep myself in good shape for dinner tonight, and bringing Adi home, but oh how I am going to want to sleep. I'm glad it looks like a rainy weekend. Screw the Catskills this weekend. I got a letter that they don't want me to even use my cabin until some final amount of accrued money owed is paid, after I sent in my last payment. Ha ha, well, I've got my plans and they are good ones, with great friends in the picture and much cheaper property and fees. It amounts to moving across the street.

Fri May 6 16:02:39 EDT 2016

I can't believe it's already 4:00 PM. I think I have a reverse caffeine headache... from TOO MUCH caffeine.

Fri May 6 17:45:48 EDT 2016

Hmmm, I may want to adjust the report process so that it outputs a complete property including all its different types of reports, instead of outputting all of one type of report. I can also make ASCII art of the property names, which I think people will like to see when the script is running... process all report-types for a given property, then move onto the next property. And don't forget to get the title-tag work properly incorporated!

--------------------------------------------------------------------------------

## Fri May 6 10:24:10 EDT 2016

### First Day after Third Wind

Wow, back at the office. Woke up at like 8:05, but still got here by 9:30 AM. Woot!
Man, am I starting to feel it coming together here. I just have to remind myself to be as good a Daddy on Friday through Sunday as I'm becoming a professional (again). This is my first opportunity to be a hot-shot in-house professional (not agency-side) since Scala, really. Agencies "let" me code here and there, but it was never really what they valued most as my primary function, even considering HitTail and 360iTiger. Tiger will be (already is) relegated to the nether-regions of vague memory, while HitTail continues to live on in fairly decent glory. I'll start purging the project-name 360iTiger from my stuff, keeping only the function it played in helping secure the upper-right quadrant in the 2012 Forrester Wave report. Oh, there's supposed to be a new one of those... go check! Yep, here it is for $2495: https://www.forrester.com/report/The+Forrester+Wave+Search+Marketing+Agencies+Q1+2016/-/E-RES122874 by Collin Colburn with Shar VanBoskirk, Emily Miller, Wei-ming Egelman, Laura Glazer. The description reads:

> In our 23-criteria evaluation of search marketing agency providers, we identified the 10 most significant ones — 360i, Acronym, Catalyst, Ethology, iCrossing, iProspect, Merkle, Performics, Rise Interactive, and Zog Digital — and researched, analyzed, and scored them. This report shows how each provider measures up and helps B2C marketing professionals make the right search agency choice.

Wow, I've worked for the first two significant marketing agencies mentioned:

- 360i
- Acronym

Too bad I wasn't responsible for the proprietary in-house tech at Acronym... they had their own already before I arrived. I wonder what 360i is touting as their own -- if that's even still a criterion now, 5 years later, with the rise of the cloud and everything. Sensibilities regarding best practices and wisdom may have changed. For example, the really valuable "brains" is stuff that's all running on other people's brawn. Will be interested in reading it eventually. Maybe that grid is available... oh, go check the 360i website... oop! Here's their new proprietary tech: http://blog.360i.com/search-marketing/360i-breaks-new-analytical-ground-proprietary-media-platform-pulse

> In the program we showcased our proprietary media platform Pulse which uses advanced natural language processing and rocket science algorithms to bid in the paid search auction and decide how much to pay, if anything, for any of the six billion searches that take place each day. The platform is breaking new ground in the use of innovative mathematical techniques and to-date has resulted in great cost savings and revenue generation for our paid search clients.

Kevin Geraghty stuff. Well, good for him! One of my favorite people in the whole world. Just sent an email to congratulate him. Gee, I hope he's behind it. Anyhoo... Okay... but today. Think! You have to get the "menu selections" made very soon, but also there's some of this report automation stuff to make sure everything keeps kicking off smoothly every day. Even though I'll keep generating these CSVs (and eventually turn back on the updating of the Google Spreadsheets), I would do the project an enormous amount of good by looking at the preferred data sources for Tableau. Maybe I should just publish a local database connection to something like MySQL or even SQLite. Hmmm, time to do local PostgreSQL installs? My own personal instance of PostgreSQL until I get write access to something somewhere on Amazon Redshift.
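If I do go the lightweight local-database route, a hedged sketch using nothing but the Python standard library might look like this -- reports.db, topurls.csv and the column names are made up for illustration, not the real report outputs:

    # Hypothetical sketch: load one report CSV into a local SQLite file so a
    # BI tool like Tableau can point at a database instead of raw CSV.
    import csv
    import sqlite3

    conn = sqlite3.connect('reports.db')
    cur = conn.cursor()
    cur.execute('CREATE TABLE IF NOT EXISTS topurls (url TEXT, sessions INTEGER)')

    with open('topurls.csv') as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for url, sessions in reader:
            cur.execute('INSERT INTO topurls VALUES (?, ?)', (url, int(sessions)))

    conn.commit()
    conn.close()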
Let's go do a little quick research. I have my Windows laptop locked into Ubuntu fullscreen-mode under VirtualBox. It's quite nice, but I need a better way to get back to Windows on occasion than right-control-key+F. Sheesh, it goes through the heavyweight rigmarole of taking Ubuntu out of fullscreen, instead of just doing some sort of viewport shift. Ugh! Wow, Mac full-screen under El Capitan is really off the hook; lovin' it. Wow, even SimpleNote shifted to full-screen, and one Unix terminal for this journal, and one for the web browser... the Mac is more Amiga-elegant with every OS upgrade. Maybe in another 20 years, they'll get all the way there. Solve the little things when they bug you, unless it requires chasing a rabbit. That deserves to be at the top of this journal. Okay, make a new section for wisdom that just occurred to me. I'll start to collect my one-liners there. Okay, I'm re-installing the VirtualBox Windows guest additions. I figured out how to get to the "CD-ROM" VBOXADDITIONS_5.0.20_106931 and double-click the autorun.sh. Just finished the recommended reboot... Google Chrome still has a rendering redraw problem that basically makes it useless on Ubuntu 16.04 under VirtualBox. I am running the 3D acceleration, but without it, it's sluggish to the point of unusable (in a modern sense). On a positive note, Windows is capturing the Ctrl+Alt+Arrow-keys to do VirtuaWin virtual window switching, so I'm not quite so "locked into" Ubuntu when in full-screen mode. The way that is working (what's capturing that key combo) just changed. If I change one or the other's key combo, I may be able to virtual-screen switch (workspaces) in EITHER Windows or Ubuntu... and that would be cool. Wow, okay done! So, it's:

- Ctrl+Alt+Shift+Left/Right - virtual-screen shift in Windows under VirtuaWin
- Ctrl+Alt+Left/Right - virtual-screen shift between Workspaces under Ubuntu

So, basically I just drop my finger on or off of the Shift key to switch which sort of virtual screen switching I'm using... unbelievably cool and usable. OMG... the second Ubuntu physical screen can be moved independently with VirtuaWin using the "move here" option, making it possible to have Ubuntu on one screen and Windows on the other, and then have them switch when you switch viewports from the Windows virtual-screen-switching perspective. Wow, my head hurts even just thinking of the convolutions possible. Avoid that. You're either ALL Windows or ALL Ubuntu. Wow, and this keyboard shortcut collision prevention also fixes the keyboard capture issues when "passing over" an Ubuntu screen while switching, making easy navigation between the two worlds not only possible, but somewhat of a magical pleasure. Stick with this. ALWAYS have Ubuntu running. Dedicate 3 full screens to each.
The fourth Windows screen is the 3 Ubuntu screens! Wow.

Fri May 6 12:35:26 EDT 2016

Okay, my boss showed one of the primary stakeholders for the reports the first version of the reports. Very happy. Same stakeholder as for the menus, and it buys me a bit of time. Now, go eat. Keep in mind you'll be meeting with Adam Whippy in all likelihood sometime today -- another one of my favorite people from my past. See if you can't keep connecting dots!

--------------------------------------------------------------------------------

## Thu May 5 20:44:24 EDT 2016

### Don't Forget Python Shelves for Local Persistent Lightweight Databases

Still at the office. These reports are a big thing, and they do sort of change everything moving forward. I like how my boss is a big impact player, and he's giving just as much as I am. I however am at an interesting phase of my life and my career, at which time, with just a little bit of good, strong energy, I can complete that transition to force-of-nature which started and aborted and started and aborted many times over the years... family, Commodore & Amiga, Scala, Connors & HitTail, 360i, and now... now... yup. Time to make solid contact with the ball. It has a lot to do with generic Linux, Python, vim and git. AND I basically didn't even sleep last night. I got a couple of hours, to be honest, but nothing like what I needed, and I'm well past my second wind, and into my 3rd, and still I'm staying here for da' boss. I'm hanging around in order to look at the other properties that the report will show. My counterpart in San Francisco is working on those. I did the data-pull parts of the project. I may run the data pulls again (over and over) tonight, fine-tuning things.

Thu May 5 22:35:03 EDT 2016

Still at the office. This is a testament to what a little bit of pressure can do, and how these little hyper-focused inspired bits of time can make 1000x the difference of typical... phone-call from boss. Says go home. Will have some overlap time to talk to counterpart this way, as opposed to much later and completely non-functional. And the title-fetching is re-running, but this time with a local Python shelve database, relieving me of the biggest time-requirement on future running of this data-gathering process... woot!

--------------------------------------------------------------------------------

## Thu May 5 13:13:49 EDT 2016

### Fetching Title Tags with Python From and Into CSV File

I'm running all the titles now... pshwew! It's all about dealing with all those strange little edges, like unexpected encoding and line returns inside title tags and empty title tags. Defense, defense, defense! IMMEDIATELY switch your thought-process back to the menu project! After the menus, you can set this up for full, scheduled automation. Figure out this CSV issue. Apparently, Tableau needs Microsoft JET in order to connect to CSV files as a schedule-to-update datasource. Terminology varies surrounding this, but basically the documentation says it's doable, but it's not something that's known or considered to be mainstream or easy or without ridiculously prohibitive complications. Most people who discuss it say: just put it in a database of some sort.

Thu May 5 14:43:55 EDT 2016

Okay, it's taking longer to process the titles than I would have liked. I will have to look at a way to speed it up for next run. This is by far the most time-consuming part of this process. All done. Okay, Close & Save State of Ubuntu, and go back to Windows for a bit. Yay! Not really.
But once you have Chrome running and a MinTTY shell, it's not too bad. My head is spinning... not sure if I can even be productive. Took a walk outside and got some lunch, but my second wind is wearing off, I think. Muster your strength now, and power through this Menu work. Get into the zone.

--------------------------------------------------------------------------------

## Thu May 5 09:34:41 EDT 2016

### Python CSV File Processing For Fetching Title Tags

Okay, at the office. Glad I'm documenting all this. Extend your estimates dramatically for any work you need to do, at least x4. If you think it will take a day, say a week. If you think it will take a week, say a month. This is because of the unknown factors. What, after all these complex data-pulls and joins, you want the end result sorted? Okay... pull an all-nighter. Not a young man anymore and can't keep working in that mode. But the high of this type of work is still quite delightful. To be challenged is to be human. The quality of the human you are can be at least in part gleaned by how you meet challenges. Anger and frustration, versus... I guess in programmer's terms, the smartest lazy way to do a thing. And in this case of filling in title tags against a bunch of csv files, I think the built-in Python csv package will more than suffice, and I actually started to program it on my Microsoft Surface on the subway on the way in today:

    # Open a source CSV and for every line, write into destination CSV.
    import csv

    with open('source.csv', 'rb') as fin:
        reader = csv.reader(fin)
        with open('destination.csv', 'wb') as fout:
            writer = csv.writer(fout)
            for row in reader:
                # scraper() is my title-fetching helper; the xpath grabs the <title> text
                row[0] = scraper(row[0], "//title/text()")
                writer.writerow(row)
                print(row)

I was tempted to make this integrated into the report-generating process, but then I came to my senses. Get it working stand-alone first. Get the repo onto your (now main?) Ubuntu platform. Pandas is not involved in this latest title work, but it is really important to get it onto Ubuntu, and the way to do it is:

    sudo apt-get install python-pandas

Thu May 5 10:37:31 EDT 2016

Oops, accidentally blew away the long-time-to-generate-from-SQL csv files. Oh well, re-generating. I could get them back from Dropbox recovery, but why bother? I have to test the Pandas stuff from here at the office now that I have it installed -- basically see the SQL plus CSV-sorting work all working together. I will however be careful to not make that mistake again. Okay, I have re-established the sortcsv program. That private repository limit on Github was really a pain today. I think I may look at Bitbucket for stuff like this. As soon as the SQL finishes running again, save off a backup of the csv directory, and then run the sorter against all files in the master directory. And then wire your title-tag fetching work against a non-Dropbox location to keep what Marat needs intact right up until the second I have titles filled in. Okay, just 3 more SQL items running. Go get another coffee and take a potty break. Done. Feel really good about yourself. This is the culmination of a lot of stuff. No matter how stressed you feel, it is NOT from being up against tasks that are bigger than you. The tasks are indeed perfectly suited for you. The entire issue, not that it's even really a bad issue -- unless I actually do drop dead from a heart attack or something -- is how unnaturally fast (for me) I'm trying to get it all done. But even a big chunk of that is probably truly time management. Procrastination.
Not extreme in this case, but it is definitely there. I need the chipping away... the chisel-strike strategy... more than ever. Okay... while this last SQL statement executes, get prepared for that title stuff. AND DON'T FORGET TO BACK UP THE DROPBOX FOLDER! Okay, done and done. 1, 2, 3... 1? Titles! Fast now!

--------------------------------------------------------------------------------

## Thu May 5 07:27:53 EDT 2016

### Maximum Number of Private Repositories Reached on My Github

Before actually leaving for work, see if you can't actually hit this home very easily. Hmmm, I'm changing my mind again. I'm not going to mix this into report.py right off the bat. This will be a new private repo again. Make a private repo named titles. Oops, reached the private repo limit. Delete sortcsvs, as it's now incorporated into the reports repo (also private). Okay, titles repo made. Okay, and used the still-local copy of sortcsvs as the starting point for titles. I could have just renamed the repo. Live and learn. 7:45 AM... not so early for work, after all. Okay, this is a good starting AND stopping point. Get freshened up and get into work.

--------------------------------------------------------------------------------

## Thu May 5 04:17:21 EDT 2016

### Pandas Sorting CSV Files

And now for the pandas sorting tricks:

- http://stackoverflow.com/questions/17870476/sorting-rows-in-csv-file-using-python-pandas
- http://stackoverflow.com/questions/15559812/sorting-by-specific-column-data-using-csv-in-python

    import pandas as pd

    df = pd.read_csv('Full_List.csv')
    # sort() is deprecated in newer pandas; sort_values() is the current spelling
    df = df.sort_values('red label')
    df.to_csv('Full_List_sorted.csv', index=False)

Thu May 5 07:24:23 EDT 2016

Got some sleep! Also did the csv-sorting work. Only titles remain! Maybe handle it a similar way. Start with a function directly inside report.py. There's really no reason to break it out to a separate file and repo first, like I did with sortcsv. It's 7:25. The best thing you can do for yourself now is ACTUALLY get into the office early. I can maybe even write this function on the train. Take my Surface Pro? This is the sort of reason you have it, isn't it?

--------------------------------------------------------------------------------

## Wed May 4 22:05:20 EDT 2016

### Finally Installing Pandas... A Different Kind of Panda for an SEO

Hmm, the last update was my morning one around 9:30 AM. Well, I've had a pretty full day of updates. I'll be getting myself into a merge situation, but I have to make myself feel totally free about going ahead and typing new entries, knowing that everything on the other machines I journal from will eventually hit here. Journal, journal, journal away, and god knows I sure need to tonight. I have a ton of work still to do, from finishing up the report automations, now including:

- Sorting CSV files
- Filling in missing title tags
- Adding 4 more properties to the automated reports

Wow, there's not a moment to waste, and I can't let diminishing returns set in, either. Okay, that means getting VPN working from home, but I feel pretty confident about that after my experiments from work the other day. It's pretty amazing, after how many weeks of Rachel having moved out, that I still haven't had time to give the place a good organizing and cleaning. Even the little stuff every day is difficult with this new job ramping up. I'm just over 2 months in, and am just starting to pick up steam.
This job tonight is A LOT about turning that high energy-in of the build into self-sustaining momentum, and making all subsequent jobs easier and easier. Now, get some coffee brewing and do a kick-ass job. I had a dream last night that I was fired from my job for not doing those reports quickly enough... sheesh! Got to learn not to over-promise. Pad everything four times as long as I think it should take, and that will probably be how long it actually takes. Going through a lot of these things for the first time doesn't help much either, like the nuances of connecting to Amazon Redshift. Speaking of which, that should be my first order of business... doing that from home. Pump yourself up on caffeine and keep yourself journaling and honest. Every moment is precious now. Don't start losing your capacity. It's perfectly fine to burn the midnight oil in situations like this. The brain and the body can override natural rhythms on occasion... the survival instinct.

Thu May 5 01:19:28 EDT 2016

Ugh, took a hot bath to infuse some second wind into me. Okay, and a second coffee coming up. 1, 2, 3... 1? VPN! Done. Okay, now try running report.py... ugh! pg_config again. Okay, this looks like a job for Homebrew!

    /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
    brew install postgresql

Fri May 27 11:52:41 EDT 2016 ADDENDUM: I had to (on another machine) do:

    export PATH=$PATH:/usr/local/Cellar/postgresql/9.5.3/bin
    sudo pip install psycopg2
    ...(a whole bunch of output)...
    Successfully installed psycopg2
    Cleaning up...

Woot! Okay, still had to clone the repo that contains the SQL text files into a directory next to the reports repo, and also had to create the module that has the database login credentials. Luckily, I was well prepared. My "only works from the office" work is now working from home. Nice. Oops, had to update my Dropbox path. Damn, I wish those directory structures were part of the POSIX standard. Okay, done and it ran successfully. Now, add the other properties. Test one new property first. Done. Do the rest. Done! Woot! Okay, I don't have GA for one of the 4 new properties, but that's fine, because the GA stuff is being handled directly in Tableau by my counterpart in SF. Anyhoo... 1, 2, 3... 1? Ah! Sorting the csv files. Easiest way I found is with the pandas library. Kind of like using a machine-gun to hit the ball in chip-and-putt, but so be it. If it installs clean on OS X at home, I'll have no problem on Ubuntu at the office. It's dependent on numpy, and that's always a large, interesting install. As it's going, I'm getting tons of warnings about loss of integer precision during conversions. Not a good sign. Here's the problem: http://stackoverflow.com/questions/12436979/how-to-fix-python-numpy-pandas-installation

    brew install python    # (I may have to re-install other packages again)
    pip install --upgrade pip
    pip install pandas
    hash -r python

Okay, I can run python and import pandas. Let's commit this crazy entry, and move onto the actual csv sorting... and then to titles... and then to final menu selections... ugh!

--------------------------------------------------------------------------------

## Wed May 4 13:29:43 EDT 2016

### Against The Clock

Okay, back to data automation, and the title tags. Work smart! Don't chase any rabbits! First thing, lock in on just working on Ubuntu. You don't really need to switch virtual screens that much. Make sure you have a way to see emails come in.
Minimize the need for screen-switching, and really, distractions of any sort. We have the SEO Triweekly coming up in an hour, and I'd like more to show Marat before then. It's taken me until now to get back to what I had to painfully put down on Thursday. 1, 2, 3... 1? Okay, remember where your work files are. They're at root's home. Okay, move them back to /home/myusername/... done. Load report.py into vim. I'd like to control which properties run from inside report.py instead of having to edit a json structure in cfg.py all the time. Hmmm.

Wed May 4 14:50:17 EDT 2016

Just had a meeting with the menu-project stakeholder... woot! I am soooo glad I am here at this organization. It's almost already 3, and I have to:

- Debug the reports so they ALL get generated
- Add titles to 2 of the CSV-file outputs

Ugh! I need (want) a 2nd screen now with Ubuntu. Okay, VBox is capable of that, so let's do it. Going to 2 monitors with Ubuntu 16.04 under VirtualBox on a Lenovo ThinkPad laptop (with docking station). Don't chase the rabbit, but do look for a way to get virtual screens now within the Ubuntu dual-monitor virtual machine. Okay, got it set up the way I like with 3 horizontal virtual desktops and only 1 up-and-down.

    gsettings set org.compiz.core:/org/compiz/profiles/unity/plugins/core/ hsize 3
    gsettings set org.compiz.core:/org/compiz/profiles/unity/plugins/core/ vsize 1

Wow, I had to turn 3D video acceleration back on, and now I have virtual desktops... called Workspaces under Ubuntu. I need to do a few tweaks, like:

- Show the menu for a window in the window's title bar (as opposed to the menu bar)
- Set menu visibility to "Always displayed" (as opposed to displayed on mouse hovering)

I also had to do a bunch of chowns to take back ownership from when I got everything going as superuser. Yeah, yeah, I know. Okay, I can run all the report-types for one of the properties. Try another. Oh wait no, before trying another, do just a little bit of clean-up. Do some output to show when you're writing the CSV files. Ugh! I had my first complete system-freeze in Ubuntu under VirtualBox. Serves me right for all the awesome I'm trying to squeeze out of it. At least it does a forced reset fairly quickly. Wondering if Dropbox is the culprit again.

Wed May 4 17:02:55 EDT 2016

Okay, here's the ultimate 80/20 rule test (for me). I'm going to turn off the GDocs part of my script... okay, better global control variables in the cfg module. Yeah, yeah, globals are baaaaaad. Four legs good.

Wed May 4 17:23:33 EDT 2016

Okay, next! Well, it's time to make the topurls and commerce functions into the same function, with an extra parameter to make it behave the two different ways (a rough sketch of the idea is below). They're almost identical except for the sqltop portion and the name of the files that get output as a result.

Wed May 4 18:35:38 EDT 2016

Ugh, just got pulled into helping wrangle some other parts of the project that are not coming from my data-pulls. Lost over a half-hour on that. Get back on-track and plow through what's left. And now it's almost 7:30. I got through a couple of the bugs.

Wed May 4 20:27:45 EDT 2016

Okay, I have the first round of SQL all executing. Now, for the second -- which should be much easier, because they're all almost the same query in this second case.

Wed May 4 20:44:15 EDT 2016

Okay, all this SQL is executing very nicely. After those are done, I will run the Webmaster Tools and GA ones before I leave (the GA ones are out of sync with the latest refinements, and unnecessary). But I also need to do a few follow-up items.
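Circling back to the topurls/commerce merge from earlier this evening, here is a hypothetical sketch of what folding the two near-identical functions into one parameterized function could look like; the SQL file paths, the cursor argument and the csv/ output naming are placeholders, not the real report.py internals.

    # Hypothetical: one report function keyed by report type.
    import csv

    SQL_FILES = {
        'topurls': '../core/topurls.sql',    # placeholder paths
        'commerce': '../core/commerce.sql',
    }

    def run_report(curr, report_type, prop):
        """Execute the SQL for one report type and dump the rows to CSV."""
        with open(SQL_FILES[report_type]) as f:
            curr.execute(f.read())
        outfile = 'csv/%s_%s.csv' % (prop, report_type)
        with open(outfile, 'w') as out:
            writer = csv.writer(out)
            for row in curr:
                writer.writerow(row)
        return outfile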
It's almost 9:00 PM, so I will do them at home... I will make sure I really CAN do them at home. Sheesh! This may be a late-night thing, for real this time.

--------------------------------------------------------------------------------

## Wed May 4 09:37:52 EDT 2016

### Mild-to-Medium Pressure

Okay, even with the time-crunch of deliverables, the fire-drill, and NOT working during the night last night (as I had planned again), I can still make today an awesomely productive day. Just work smart! Okay... hmmm... I have four key things today:

1. Finishing the last "Strategy" of the menu work
2. Sending an email to the stakeholder saying I'm ready to discuss
3. Switching over to the data-pull automation work (getting all files generating)
4. Doing the title-pulling work (Scrapy vs Requests + BeautifulSoup? -- a rough sketch of the latter is a bit further down)

Wed May 4 10:11:28 EDT 2016

Just came outside and walked over to Madison Square Park to dodge the fire-drill and get a little offline work in before heading back up. Touched base with my boss with my plan, and he said it sounds good. He likes updates and plans, but I have to keep my long-windedness and erudition in check. Also, don't hop on any unsecured WiFi nets out here, no matter the temptation, unless I want my identity stolen. Okay... next? Oh, the PowerPoint of course, which I have available offline thanks to Dropbox and can edit on my Mac thanks to my own personal copy of Office 365. Okay, the PPT work was easy enough, but that 160MB Excel file brings my poor little 4GB i5 Macbook Air to its knees. I'll have to do the actual spreadsheet work back in the office, but dodging the fire-drill was worth it to get the PowerPoint slide done. Correction! After I turned off Dropbox desktop integration, Excel becomes just this side of usable, and I can do the work outside while I make very certain the drill is over, and I'll be the first in line for the Shake Shack. Ugh! There were pre-selected filters that I have to release before the copy-and-paste work. I can't believe that's a big enough deal, providing enough waiting time, that I can mention it here. Old laptop aside, getting this stuff off of Excel and onto a killer Web-UI will be one of the biggest favors I can do for this company. Shake Shack order in, to go. Will take it back and eat at the office while I do the copy-paste work and complete the deliverable. Just release the filters and save so you're ready to hit it hard when you get back to the office. Releasing filters and saving, bringing a Mid 2011 Macbook Air to its knees with a mere 160MB Excel file. Ridiculous!

Wed May 4 11:15:29 EDT 2016

Okay, back in the office, and my newer Lenovo laptop on an older version of Excel (2010) handles this file about 10 times faster than my Mac. Combined all the modifier tabs into GSheets. Now, look for where the best 2016 data comes from.

Wed May 4 12:29:02 EDT 2016

Okay, just delivered the PPT deck to boss as ready for the stakeholders. Go look at the GSheet to make sure it's likewise in good shape... Okay, it is. Also went over with boss. Final reco's not really there yet, so must set expectations for final delivery tomorrow, while I switch gears back to the automation work.

--------------------------------------------------------------------------------

## Wed May 4 09:22:26 EDT 2016

### Important Thoughts (Read Again)

Don't chase the rabbit down the hole today! Pushing out video on YouTube right now. We have a fire-drill coming up at 10:15. I have my alarm set for 10:00 AM. Maybe I'll step out of the building with my laptop and keep working.
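Referring back to the title-pulling question in the morning list above, here is a hedged sketch of the Requests + BeautifulSoup route (as opposed to Scrapy); the function name and timeout are purely illustrative and nothing here comes from the actual report.py.

    # Fetch the <title> tag of one URL; returns '' on any failure.
    import requests
    from bs4 import BeautifulSoup

    def fetch_title(url):
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            return ''
        soup = BeautifulSoup(resp.text, 'html.parser')
        if soup.title and soup.title.string:
            return ' '.join(soup.title.string.split())
        return ''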
I need to do more building-up projects that really enlighten people... Enlighten people on:

- An "Old-School" Linux Terminal Environment (Server)
- Editing with the vim text editor
- Installing CygWin and all the things you need
- Running Python in interactive command-line mode
- The basic Unix commands ls, cd and touch
- Creating your first text-file with touch

Oops, rabbit hole! Remember this later. Hmmm, I need a way to remember these things later. Oh, and here's my subway-writing from this morning...

Leverage the tools around you for precisely what they're good at doing. Don't put yourself at a disadvantage against others using these same tools you refuse to pick up for nonsense reasons. PowerPoint is probably my biggest offender here. Sure, there are other ways to make slideshows, like Prezi and browser-based stuff, but PowerPoint is the only way, in most (not-yet modern) business environments, that you're going to be able to forward files around in email and communicate what you need in a linear, slideshow-like format that chunks your messages into absorbable units just-so. So, PowerPoint it is, no matter how viscerally opposed to this my gut reaction is. Pipulate has issues like this. For any single thing Pipulate does, there's another tool that does it better, such as ScreamingFrog or DeepCrawl as a spider to crawl websites. However, Pipulate brings interesting possibilities into the mix by using an almost transparent framework that brings the world of headless (disposable) Python servers close to the world of Excel-style spreadsheets (which are really VisiCalc-style) living in the modern Web browser. A few tricks like ridiculously easy server deployment and the way JavaScript bookmarklets work bridge this gap in unexpectedly dot-connecting ways -- ways that open the door to infinite new, increasingly easy follow-on work -- the promise of and onus for "frameworks". Now, the magic of Python is how close frameworks built on top of Python are to the underlying Python language and environment itself. Python frameworks, with a few exceptions like Django, tend to be lightweight last-mile extensions to Python that are just ridiculously easy to understand and use. For example, the framework that turns the most decidedly non-webdev-friendly native Python environment into one roughly the equivalent of the very webdev-friendly PHP is called Flask. Now, if you don't understand how Flask is working, you can try another one called bottle.py, which essentially puts everything that turns Python into a template-driven webdev tool into one file that sits in the same folder as your work files, so you can easily load it to reference what the heck is going on. I switched from bottle.py to Flask for the Pipulate project only as I saw bottle.py growing more Flask-like, in its inclusion of the Werkzeug package for handling routing (just like Flask), and I figured once bottle.py had a dependency, I might as well go for the massively popular mainstream version of the same thing (Flask). Now, the real magic behind Flask resides in how it cleverly connects the dots between Werkzeug for request routing (using a really neat trick called decorators) and the Jinja2 Web template system for PHP-like blending of a dynamic programming language with HTML markup to format things for a browser. That's Flask, Werkzeug and Jinja2. Some would also throw Requests and Urllib3 into this general mix, though strictly speaking those are for making simplified http requests rather than serving them. And there's another package for sanitizing data against hacking attempts.
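To make that last-mile point concrete, here is a minimal Flask sketch of the pattern just described: the @app.route decorator is the Werkzeug-backed routing trick, and the template string is handed to Jinja2 for rendering. The route and template are invented for illustration and have nothing to do with Pipulate itself.

    # Minimal Flask app: decorator-based routing plus Jinja2 templating.
    from flask import Flask, render_template_string

    app = Flask(__name__)

    @app.route('/hello/<name>')
    def hello(name):
        # Jinja2 fills in (and escapes) the {{ name }} placeholder.
        return render_template_string('<h1>Hello, {{ name }}!</h1>', name=name)

    if __name__ == '__main__':
        app.run(debug=True)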
Now it's starting to sound a little less lightweight than it did when I first started to describe it, doesn't it? Think of it as the work addressing different core components of the microframework being divided up over several different dedicated and specialized individuals, all of whom submit their work to the scrutiny of the free and open source software community, usually revolving around Github.com. Plus, Flask is used in many serious large-scale deployments that constantly temper it in the fire of real-world use. And at the end of all that, it's something that can be installed on a virtual machine to run with a software footprint under 50MB, and that includes the OS, Python, Flask (and all its dependencies) and the virtual machine program, which you can download and start using right now on your Mac or PC with no software install or admin rights. Now THAT'S micro. There's a common misconception that Flask does not scale. Miguel Grinberg has a tutorial to show that this is not true. Reddit, for what it's worth, is built on Pylons with the Mako template system rather than Flask and Jinja2.

--------------------------------------------------------------------------------

## Tue May 3 13:35:41 EDT 2016

### Finish Line in Sight

Pipulate's usefulness will explode 1000-fold once I process the lists VERY FAST in the background, and chunk out and dump it into GSheets. I also want to finally do the looser coupling and higher level abstraction, so that even GSheets and GSpread are optional, as originally envisioned. This will be hot. But this is another rabbit hole. Don't pursue. At most, make notes on it. But right now... right now... get that menu done.

Tue May 3 15:02:28 EDT 2016

Okay, I have to slam through this menu work... pressure's mounting even more. Tonight can be a go-as-late-as-you-need night. I've had some pretty good focus so far today.

Tue May 3 20:31:45 EDT 2016

Wow, finally leaving work tonight. Not done, but really happy with progress (so is the boss). Woot! Finish-line in sight.

--------------------------------------------------------------------------------

## Tue May 3 09:23:46 EDT 2016

### Parsing Keywords Out of Referrers Variable With Python

Okay, let's drive these projects to completion. Less writing here, and more into PowerPoint, etc. Finally, I made the menu-work that I'm doing "love-worthy" in my mind -- a critical first step to full immersion and highest quality work, and faster by virtue of self-fending-off distractions. 1, 2, 3... 1? Oh! I need a better striking distance report! Webmaster Tools / Search Console is really terrible about showing you keywords that you're not really positioning well on yet. So, bringing a bit of HitTail magic to the equation. YES! Do that fast... 1, 2, 3... 1. Write the query... done. Very nice. But I need to parse the referrer string now. I want Python! I can have Python... wow, let me make a new private repo and slam this one out.

Tue May 3 10:30:23 EDT 2016

Okay, I wanted to try one more time to get the Amazon Redshift development working on my Windows desktop with Cygwin. Upon research, I see that the development tools for PostgreSQL can be installed from Cygwin, which allegedly gets the pg_config.exe file, so I'm installing that now. I also put in the entire MinGW toolchain for both 32 and 64 bit systems, just as a precaution in case anything needs to be compiled. Rabbit hole! Rabbit hole! But worth it, if it's not too deep. Not knowing which thing you tried made it work... ugh! Okay, I got so close, but failed.
That was hours lost on trying to get SQL queries to run under Cygwin under Windows... such a simple thing, but so many nuances. I've installed so much for PostgreSQL under Cygwin to no avail. I got pg_config.exe installed and the path set, but now it's tripping up over all this Windows thread stuff, which I also installed from Cygwin, but to no avail. My final error is:

    Command "/usr/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-fomN2J/psycopg2/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-D937H4-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-build-fomN2J/psycopg2/

Just go back to either the Mac (which I have working now) or the Ubuntu virtual machine. I'll use the virtual machine for now, so that I can keep this journal displaying on the Mac all the time. Ugh, I am going to do it on the PC. It's just too much grief to have my keyboard shortcut for switching virtual screens stop when I hit the screen-pair that has VBox and Ubuntu. Such a small thing, but the interruption to work-flow is incalculable. And so, just execute the query from SQL Workbench/J on Windows and export the dataset. Okay, it's only 9371 lines. No problem. Wow, this should be super-easy. Damn, don't chase rabbits you fucking idiot! This is plopped into my ~/ on Cygwin, because my PC desktop is still my main working environment here, even for Python, which I use through Cygwin. Oh, speaking of which, I'm going to uninstall Python 2.7 from python.org. Fewer moving parts. The dedication behind Cygwin is actually quite immense. I'd rather be in that court than the official Python distro court, due to... well, I guess it's effectively the Cygwin repository. Let's see how efficiently we can do this. No repo needed. Not everything ad hoc has to be elevated to repo status... I can use Github snippets if I want.

    with open('filename.txt', 'r') as afile:
        for line in afile:
            print(line)

Tue May 3 13:15:35 EDT 2016

The more developed version of the thing is now:

    from referer_parser import Referer

    deduped = set()
    with open('strike.txt', 'r') as afile:
        for line in afile:
            keyword = Referer(line).search_term
            if keyword:
                keyword = ' '.join(keyword.split())
                keyword = keyword.lower().strip()
                deduped.add(keyword)

    afilter = ['xxx', 'porn']
    for aitem in sorted(deduped):
        if not any(x in aitem for x in afilter):
            print(aitem)

Wow, that was pretty fun to work on. Good list. This stuff is definitely going to make me hone the SEO Kung Fu that I began to develop in Pipulate.

--------------------------------------------------------------------------------

## Mon May 2 20:58:29 EDT 2016

### Health Care Cards

What a strange pleasure for it to barely just be 9:00 PM, and I'm fed, and this time is mine. Wow! But I have a lot I want to do for work too. I am allowed to putter. I should align my puttering better with all the stuff I need to get done. I just got my health plan cards, and I believe Adi will be going to see a dentist. She will be 5 and a half next week. I need to act more like a grown-up, and I will have time to get myself organized enough to act like a grown-up. The level of disorganization that I allowed myself to slide into is not a grown-up characteristic, but rather that of an undisciplined child. I was never that financially or behaviorally disciplined in the first place, and I let myself become even worse.
But I brought Adi into this world, and that means everything. What I have to do now is help ADI get organized... better than my parents helped me. I was the after-thought... in all things. I could almost not be allowed to get the better of anything, or even anything that looked good. Big deals could not be made out of me or anything having to do with me. I see that now. And it wasn't just my parents doing it to me, because after awhile I started doing it to myself. No wonder I hate celebrating birthdays. It was self-preservation. So many destructive patterns to keep from setting in unawares with Adi.

--------------------------------------------------------------------------------

## Mon May 2 15:30:22 EDT 2016

### VPN Credentials for Working From Home

Okay, I'm running out of time, but not REALLY running out of time. Every day I go home can become working late into the night if I need it to. Okay, figured out getting connected via VPN with the Mac. That'll be key to working up late into the night on FULL REPORTS, Amazon Redshift SQL included. Make sure that's ACTUALLY true. Install SQL Workbench/J for the Mac, and then connect... okay, did it! Connected through the Mac's built-in VPN under System Settings / Networking, using the WiFi connection provided by my phone's hotspot... woot! I'm in business (at home). Okay, things are going well. Switch over to the other journal to think through the process of the current project, and where you want to have it at when you get home.

Mon May 2 17:58:10 EDT 2016

Okay, go home at a reasonable time and work on a lot of this at home. You cleared the way.

--------------------------------------------------------------------------------

## Mon May 2 09:25:14 EDT 2016

### A Less-than Fully Focused Day

Ah, I love it when a git pull doesn't produce any conflict messages. Back in the office for a focused and concentrated run, until it's done! Even typing this is a luxury you don't have, so get things underway thought-wise, but then keep it right on the edge, so you can come back and write. This is a place where the Timezipper project mentioned above would kick into play to make sure the YouTube video I shot and posted on my way into work got embedded here. Installing the VirtualBox 5.0.20 upgrade that was available this morning. Funny how Oracle owns it, along now with Java and MySQL... oh, and Berkeley DB before it. Wow, there are a lot of approvals for drivers for the VBox install. Flipping through FM Radio in my car (for the 1st time in awhile), I discovered Leo Laporte again, and he was talking about Chromebooks and ChromeOS. Ugh, well, maybe installing the new VBox was a rabbit hole, but I wanted to do it perchance it offered more options for my pointer-capture nightmare. VirtuaWin works SO WELL screen-switching Windows 7 on dual-monitor that I want to make just one of the virtual desktops into a full-screen Ubuntu, but even on the second monitor, it still captures the mouse when swapping screens, interrupting further hotkey screen swapping. Rabbit hole! Rabbit hole! Don't chase. Get back to the urgent work. Reboot machine to free it up from the software installation penalty box... crosses fingers. Okay, restarted. Wait for the hourglass to disappear before pushing my luck.

Mon May 2 10:48:31 EDT 2016

Deposited my paycheck after being overdrawn from the auto-pay 1st of the month. I should re-schedule my mortgage to not be exactly the 1st of the month, for a little safety-margin. Amazing to have to deal with these things for the 1st time in my life. Healthy to have to budget like everyone else.
I love this corner-desk arrangement. Now that I have time and room again at the apartment, I should think about an even better desk arrangement... but not now! Rabbit hole! Onto the assigned task of the day... but not before you touch-base with Marat... better-still, get that 3-times/week meeting onto the calendar. Okay, got it on the calendar. Nice simple system: Mon, Wed, Fri from 2 to 2:30. Easy to remember, easy to get-through (half-hour). Gave it a nice, strong nickname of the SEO Triweekly! Less misunderstanding there than with the ambiguous biweekly. Okay, next! Damn, I'm on the edge of a pretty amazing way of working. Type here most of the time when you can, and dump garbage onto the net, but only as a single long markdown page... hahaha! What a delightful experiment. It's my thought-process... wholly and completely unique content of the complete opposite of spam-cannon variety, but still terrible signal-to-noise ratio nonetheless. Okay, okay. Enough rambling. Now onto the project at-hand. Just added the beginning of a book outline at the bottom. Things are gradually clarifying in my mind, about what sort of book could BOTH feed my soul AND be commercially successful. Super Serum! Python Super Serum? Hmmm. What bandwagons in particular are you hitching onto? What venn diagram are you trying to reduce your potential audience to? Rabbit hole! Rabbit hole! Back to business. 1, 2, 3... 1? One must obviously be loading the very spreadsheet I'm talking about completing. Sent house-keeping email to Marat to make sure I don't drag my feet there. So, load that sheet... Mon May 2 12:18:01 EDT 2016 Newsflash! Potential higher priority work popped up... no, cleared it. But I still have 2 tall orders on my plate for today. Finish up the one you were planning on working on, then shift to the other. Clear the rest of the week for testing and refining the raw data feed work and the contingency reports. Don't stress yourself out with pressure. Work smart and work well. Also, keep the outer-loop Happisheet project in-mind. Mon May 2 14:50:03 EDT 2016 There's very little chance of me actually getting the project slated for today actually done and ready for presentation in 3 hours, but I can try. I just may be sitting on my first uninterrupted (mostly) 3-hour stretch for the day. That's still a good stretch, if I can make sure I don't hit diminishing returns. -------------------------------------------------------------------------------- ## Sat Apr 30 11:58:39 EDT 2016 ### Heading to Catskills For Opening Life is full of these little fractally edges that I don't do well with. Life doesn't move forward in smooth, easy steps. There's lots of messy details and lots of thrusts and jabs and false starts and losing ground, only to hopefully, albeit somewhat frustratingly regain it. I have not chosen an easy path, but I am determined to stick with certain key aspects of it, so that my daughter ends up in as positive of a place in life as she might. One of the nicest experiences that I'm re-discovering is puttering. It's amazing to not feel hurried and rushed onto the next thing, without having finished the last. I've been living by the 80/20 rule to almost an impossible level for myself -- NEVER chasing perfection, because of a sort of fatalistic defeat. But no longer! I make a little positive progress every day. Almost got set back, but fought and resisted. Stood up to the storm, and became a larger prevailing weather pattern. Felt good. Just explained cursor movement in vim to Adi. 
Getting ready to go now...

--------------------------------------------------------------------------------

## Fri Apr 29 20:56:24 EDT 2016

### Got Latest Work Running From Home Mac

Another fine day with Adi. I was actually working for a significant amount of the day from home, and still got some work done -- amazing! I still have some to do over the weekend, including enough thought-work to impress some folks at the office on Monday. I don't want to pull the procrastinator's pressure-trick. There's a golden middle of creatively inspired work, without compromising on quality or putting excessive stress on yourself. This is the "I work better under pressure" crew. And they do. Studies show there's a golden middle. The most creative ones are not the far-in-advance planners. It is the confident kooks who goof around, delaying just long enough to have given the subconscious and the other sentient subsystems at play a chance to lay the latent groundwork of a better answer -- an unlikely, and difficult-to-see answer that actually TOOK that time to percolate. BAM! And tonight, I needed to run reports.py on a Mac. Ugh! There were Mac dependency problems, specifically in the Google Python API, that gave me this error:

    File "/Library/Python/2.7/site-packages/oauth2client/client.py", line 491, in _update_query_params
        parts = urllib.parse.urlparse(uri)
    AttributeError: 'Module_six_moves_urllib_parse' object has no attribute 'urlparse'

And some googling, somewhere on stackoverflow, turned up the advice:

    sudo pip install -I google-api-python-client==1.3.2

And indeed, that did solve the problem. So now, I have reports.py running on Windows, Mac and Ubuntu. Private repositories in github do make quite a difference. Yeah, I know, bitbucket. But there's just something about github.

--------------------------------------------------------------------------------

## Thu Apr 28 21:17:48 EDT 2016

### Sad About The Catskills

Well, the writing's on the wall. It's probably the Catskill place I'll end up giving up. It's way too much redundancy in my life. I have TWO places there, essentially (bought both units of a double) and am paying twice as much as anyone else, and, aside from picking Adi up now, it's the only reason I'm keeping a car, which makes me have to keep a parking spot too. Just by getting rid of the Catskill place, a whole cascading series of money-saving is possible. But it can be sliced and diced so many ways, such as selling only one of the two Catskill units. But that only lightens the burden, and keeps pretty much the same surface-area in terms of mind-share and slice-of-my-life maintenance. Now that the family is broken up, there's not a decision of this type that can't be re-thought. Don't over-build how significant it may be to Adi's future. There's plenty of other things to fill that role in her life. I have no doubt it is very special, but it's not be-in-it-alone special. I need help that I don't get. And that pretty much makes the decision for me, doesn't it?

--------------------------------------------------------------------------------

## Thu Apr 28 11:45:27 EDT 2016

### Adi Worked Late With Me At The Office

Okay, next! I need to load the SQL that's executing from off of the drive. Fortunately, file access isn't as tricky in Python as it is in other languages I used in the past. It's a simple open call.

Thu, Apr 28, 2016 1:40:16 PM

Okay, switched to my PC again. Adi's using my Mac laptop. It's amazing how much I need this journal to think. Okay, it's like I'm very practiced at this now.
I don't want to expose the queries themselves in the report repo, but interaction with the database and the logic of formatting and outputting the results is fine. In particular, I need to make sure everything ends up as CSV files on the network drive. Okay, so... 1, 2, 3... 1? Make sure you can access a file with Python OUTSIDE the current directory it's being run from. In particular, I need to go up-and-over with a ../core/file.sql

Thu, Apr 28, 2016 3:15:50 PM

Just had a phone-call with Marat and Bert about the Tableau reports. My urgent mission is to get the automated data pulls done and a sample report based on them ASAP! Hmmm, okay, next step? Incorporate the SQL from the core repository into the report repository and move over the config file... okay, done. The main reports.py is now performing SQL queries against the Amazon Redshift databases we call Core. BAM! Okay, I'm in the final stretch. I need to make it hit the CORRECT SQL files. Okay, so un-hardwire the reference to the files.

Thu, Apr 28, 2016 5:48:43 PM

Okay, almost there. All CSV files being created correctly, and shuttled out from the Ubuntu VM via Dropbox... wow! Anyway, only the last SQL query, which is almost identical to the first one, is left. Go do, then do the formatting from home. Don't forget to send Marat the files.

Thu, Apr 28, 2016 6:39:34 PM

Done! Adi worked late with me at work today... hahaha!

--------------------------------------------------------------------------------

## Thu Apr 28 10:24:31 EDT 2016

### Connecting To Amazon RedShift From Python Using psycopg2

Okay, I switched Dropboxes on my Mac at the office to use my ZD dropbox account, and lost my .vimrc file reference, so I just did a git clone of git@github.com/miklevin/vim into the new dropbox location, and BAM! Got all my syntax highlighting and macros back. Woot! Okay, challenging day. Adi's here at the office doing her classes, and I was already called down once for her needing me. I think I got her settled, and hopefully she'll love what's going on. I think she might. It gets into robots and stuff as the day proceeds. This just might be the sort of thing that profoundly impacts her, in the positive way of course. But let's get to business. 1, 2, 3... 1? First thing is to get back into automation-land and take a stab at a big data refresh. Now, if I'm going to update all the data at all, I might as well automate the whole process, I'm so darn close. So, 1 is getting my VirtualBox with Ubuntu running again... this will always be my 5th (or right-most) virtual desktop in my VirtuaWin layout. I used the VirtualBox save-and-restore state feature for the first time. Ugh! Mouse disappears whenever I click inside Ubuntu. Still providing input -- you just can't SEE the mouse. No time for Murphy today. Restart immediately and never suspend/resume Ubuntu again. Either leave it running or shut down completely -- can't afford the frustration. Ugh! A restart didn't help. I'm selecting "Reset virtual machine"... this had better work. VirtualBox on a PC is not looking very strong versus VMWare Fusion on a Mac. World of difference. Okay, the solution seemed to be to turn off "Enable 3D Acceleration" under VirtualBox/Settings/Display. Go figure. Anyhoo... 1, 2, 3... 1? My new step one is to make sure the scripts as I left them still run. Confirmed. Undertake final automation steps? Perhaps. Take a quick grab at it. Getting the data refreshed and in location is such a critical step in making everything else easier.
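Since the up-and-over ../core/file.sql loading keeps coming up in this entry, here is a small sketch of resolving that path relative to the script itself rather than whatever the current working directory happens to be; the file name is just the placeholder used above.

    # Load a SQL file from a sibling "core" repo directory,
    # resolved relative to this script's own location.
    import os

    here = os.path.dirname(os.path.abspath(__file__))
    sql_path = os.path.join(here, '..', 'core', 'file.sql')
    with open(sql_path) as f:
        sql = f.read()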
The things I'm afraid I may not be able to provide in this first pass (in an automated fashion) are:

- title tags: I can post-process these outside the automation
- commerce clicks: technically possible, but requires a revisiting of some SQL I would rather not revisit

1, 2, 3... 1? Ah! Getting the SQL from github. Did a few fixes to the SQL. Now, what I need to do is put what I did in the interactive Python console as a Hello World success-assured test into a generic "execute this" script. Once that's proven, externalize login credentials and add to the report repo. Then, incorporate it formally into report.py for full automation. Okay, that's a plan. Go fast! Re-do the success in the console...

    import psycopg2
    conn = psycopg2.connect("dbname='blahblah' user='human' host='foo' port='bar' something='somethingelse'")
    curr = conn.cursor()
    curr.execute('SELECT TOP 10 * FROM TABLE')
    curr.fetchone()

And that's it. I got the data back. Good ol' Linux platforms. Next step! Put that into a sql.py file and make sure it runs as a command-line Python script. Okay, done. Now, externalize login credentials. I'm tempted to use pickle, but I'm just going to use a human-readable JSON-like Python file that I import, just like I use cfg.py, but it will be an obvious/simple username/password JSON shape from how it's invoked. If people can't produce the proper config file from reading the Python code, they have no business setting it up for automation. Okay... here it is:

    import psycopg2
    import myob

    dbname = myob.constr['dbname']
    user = myob.constr['user']
    host = myob.constr['host']
    port = myob.constr['port']
    password = myob.constr['password']
    constrtuple = (dbname, user, host, port, password)
    constr = "dbname='%s' user='%s' host='%s' port='%s' password='%s'" % constrtuple
    conn = psycopg2.connect(constr)
    curr = conn.cursor()
    curr.execute('SELECT TOP 10 * FROM sometable')
    for onetup in curr:
        print(onetup)

--------------------------------------------------------------------------------

## Thu Apr 28 07:40:37 EDT 2016

### Bracket the Problem

Waking up Adi. Gonna be fun. 20 min to get her brushed, dressed and on our way. We'll pick up any breakfast we want on the way. I've got the itinerary in my pocket, and the release form in my back pocket. Lunch I'll figure out as I go... maybe pick up something while she's in ZD class. Gotta look at the schedule and get down the times I need to meet her and where. The whole thing is over at I think 4:00 or 4:30, so it's a crunched and compressed day. Every moment not spent with Adi will be precious in getting this done. And I will get this done. Will be thinking about it and organizing it in my mind as I go. Bracket the problem. Defense. Offense. Long-term. Short-term. Opportunism. Long-shots. Safe bets. High risk tolerance and quick, small banking of winnings. Different strategies to mentally bracket, and slice & dice and hedge accordingly. Will be fun.

--------------------------------------------------------------------------------

## Thu Apr 28 07:19:32 EDT 2016

### Taking My Kid to Work

Today is Take Your Kid to Work day, and Adi was old enough at 5 years old, so Adi is going in. She's going to have to do as she's told. This is going to be interesting. I also have a fairly ambitious deliverable to deliver today. Again, this is going to be interesting. But I will sort it. Because that's one of the things we humans are good at -- we sort... we sort, when the index requires significantly fast pattern recognition in order to use. If there's a numeric primary key, computers sort better.
But I will manage it all today, AND I will excel and put a finer point of methodology on it than one might expect goes behind a project like this. I can do that, because now that Adi is old enough for me not to be 100% present in her interactions or 100% present as the responsible adult in charge when it's not mommy, I can think about other things occasionally when she's around... such as this. How to... well, that's for another journal. Later, gator!

--------------------------------------------------------------------------------

## Wed Apr 27 13:53:15 EDT 2016

### A New Primary Menu SEO Reco Deliverable-type

NOW... Now... now, we do a new category of SEO deliverable, different from any I've done yet here, or perhaps even at all before. Take your time (such as it is), and think about what format will surprise and delight your stakeholders. What are your tools, data, resources, etc. for this project? And THIS is where we switch over to our proprietary journal. Not for you!

--------------------------------------------------------------------------------

## Wed Apr 27 13:46:26 EDT 2016

### Time-off Request For Friday Submitted

I sent my request through ADP for Friday off. Adi will be staying with me tonight, and coming into the office tomorrow with me for take-your-kid-to-work-day. I had to get that ADP request in, so I could keep her through the weekend. This is opening day at Spring Glen Woods, but frankly, I don't know if I want to go. It was a "family decision" when I got into it, and now that it's just Adi and me, I don't know if those original premises are still true, and (in addition to the money aspect), I would be trying to get that stuff off my mind while still, and actually even more-than-ever, only being in it on my own, in terms of cleaning and maintenance. It became strikingly clear to me last season, and that's probably why I'm dragging my feet on just about everything regarding that place now. Proceed with honesty about yourself and Adi, and proceed with intelligence. Factor out all the old, no-longer-applicable considerations.

--------------------------------------------------------------------------------

## Wed Apr 27 12:03:43 EDT 2016

### Life Moves Forward In Violent Jerks

My meeting to deliver my findings from yesterday to the site's stakeholders this morning went very well. I am truly in my element here. Gotta put some structure around these investigations. Make them very, very modern and SciFi-like -- probably, primarily with Gephi. Damn, I wish DeepCrawl could export "edge" data for Gephi, but it's got to be ScreamingFrog for now. There are other approaches to crawling a single site at-scale that could make the edge data available for export and visualization (Scrapy?). Anyhoo, there's no reason to deny yourself food these days.

Wed Apr 27 13:14:55 EDT 2016

Got food and sent by FedEx my payment for the Catskills co-op bungalow, along with a message that I may not be able to afford it anymore, and will get it ready for showing. I am late for the deadline they set to get the payment in for them, but it's the first time I actually had the money and my head wasn't under water. Haha, now I start talking about THIS sort of stuff on my daily journal. So be it. It's part of what makes it interesting, that I'm dealing with all the trials and tribulations of life here, and not just which Python package is best for my current projects. I am over-extended again. And it's not just financially. It's emotionally and capability-wise.
I cannot be in all this alone -- or if I am, I cannot be spread as thin as I was when I thought I was NOT in it alone. But truth be told, I was in it alone then too -- on the basis of payments, repairs and maintenance, transportation, and just about every other measure. She even stopped using the place more-or-less entirely last season, so why should I keep myself so enormously extended? If I give up the Catskills place (really, the Shawangunk ridge), then I can give up the car and the parking garage space for the car, and I save boatloads of money that I could either save or apply to other things. Alternatively, I could give up JUST the bungalow but keep the car, and have endless weekend adventures with Adi, with a new destination every weekend, with the old Catskills colony just being one of the places we could choose to go. Day-trips would be no problem, and overnights, we could stay at her grandparents' place, which makes a lot more sense than the minimal usage we've been getting out of our double. This is all about surface-area. I have too much surface area exposed right now. It's time to contract and pull it in a bit. Less over-extending. More conservative approach to life, in general. Instill this into Adi as well as we go. The grandparents' Catskills place will probably be enough for her to keep that in her life. And start thinking seriously about whether you WANT to keep the main NYC co-op. I've had to go to war with the neighbors over the years, not of my own choosing, and those are wounds that will probably never heal. My tolerance for misery and pain in everyday life is probably a few notches higher than your average Joe's, but I can only sustain it so far. Now with her gone, maybe it gets a little better, but it's an awful lot of space for just one person, plus a kid only on weekends. I bet I could keep plenty of continuity with Adi even without that fucking place in her life. I think I've had just about enough of co-ops (never again). And THAT was my lunch break. So, what now? Oh yeah, my even-more-urgent need to kick ass at work so that I can keep earning what it is I was able to claw my way back up to, after my fall. Perhaps it's all for the best, because at least this way I finally have a "product" of my own in Pipulate. Sure, it needs some cleaning up, but everything about the project is awesome. Even the cleanup is going to be an online event, able to generate tons of modern currency. Generating modern currency... yep. I'm onto something. The tardigrade circus and that. 20 years 'til "retirement". Make more revenue streams.

--------------------------------------------------------------------------------

## Wed Apr 27 10:03:52 EDT 2016

### VirtuaWin Under Windows 7 & Ubuntu Under VirtualBox

As much as I hate to do it, I'm going to have to quit out of the Ubuntu... oh wait! No, the purpose of my 2-monitor system has now become clear. While work on a primarily Unix/Linux-like system isn't my primary daily chore, I can move it over to the second (laptop's main) screen when docked at the office, because the main problem -- the way it captures the mouse when I'm swapping between virtual desktops per VirtuaWin -- goes away because my mouse never hovers over (and gets captured by) the VirtualBox/Ubuntu full-screen instance. It is never "under my mouse" because it's on the secondary screen. And then I have Ubuntu displaying ALL THE TIME, which keeps it top-of-mind for me, and probably enhances my mystique to boot. I should be using Arch Linux...
hahaha, but no, I need to be on a Debian derivative, and EVERYONE seems to be jumping on the Ubuntu bandwagon. It's what ChromeOS started out on, AND it's what's behind the "bash shell" that Windows 10 is choosing to bake in this summer. Switch Ubuntu's Brightness & Lock settings to turn off the lock and never turn the screen inactive. And remove the need for the password when waking up. The host machine will provide that type of security. This means I won't be keeping my Outlook calendar front-and-center on the second monitor. Keep an eye on your meetings! Okay, enough settling in for the day. It's already 10:15 AM! I need to start immersing myself in the data for today's project. Again, remember: emphasis on the data behind and reasons for each thing I recommend. Have numbers to back it up, and not just any numbers. I should take "competitiveness" and precisely who those competitors are into account, along with my best assessment of the aggregate searcher intent behind each recommended phrase. Organization is key. You have to be more organized here than you have been in the past. First, there's all the locations where you have to get in the habit of putting things and looking for things. Oh, install Google Drive for the PC desktop. Why have I waited so long? Okay, but it's done. Interesting! My Authuser 0 and Authuser 1 in my bookmarklets have switched which Google user they're tied to. I gotta keep an eye on that. I may have to keep two sets of bookmarks (3 or 4, for that matter) of the same bookmarklets, but with each set tied to a different user number. My ZD account is on 0 and my personal gmail is on 1 at the office, but at home this may be different. I'll have to see how that syncs and "moves around" between my machine and Chrome instances. It is probably something I have some control over in Chrome settings.

--------------------------------------------------------------------------------

## Wed Apr 27 09:27:28 EDT 2016

### Guido van Rossum is my new Jay Glenn Miner in my lexicon of modern tech heroes

Oh weird, I didn't do a journal update from home last night. I went to sleep on the early side and woke up on the late side, with the exception of dragging my ass out of bed to feed the cats at 5:00 AM, so I could have the last 2 hours of sleep in peace. However, I did none of the apartment clean-up that I had planned. It's going to be a cut-short week as far as what I can accomplish on the home-front, because Adi is being dropped off today so I can take her into work with me tomorrow. I consider this important and formative for her. These are the things I can do by only having one kid that I cannot pass up on. It is well within my ability to pull this stuff off, and it's square in the category of what's really important in life. Everything in Adi's life now imprints upon her GREATER than things that come later. This is core. This is foundational. This is the framework of her life. Making better frameworks. Making frameworks on-the-fly, without the creation of the framework itself being the point in-and-of itself. Joel Spolsky's attitude towards architecture astronauts is spot-on, and this is one of the many reasons so many people love Python. Python is what you might consider a low-level framework itself, even though it is a very high-level language. But as far as frameworks go, projects like bottle.py demonstrate how entire web-publishing frameworks akin to the entire PHP system itself can be created in Python with one reasonably short single file that just goes the final mile.
Python runs almost the entire race for you, but it's that home-stretch that shapes and defines the framework, differentiating bottle.py from Flask from CherryPy from webapp2 from Pylons from just about every other of the numerous web frameworks you can choose from for Python. There wouldn't be that many if slamming them out on top of Python weren't that easy. The cadence of my days is really shaping up. This is a somewhat more intense job than agency-life, in delightfully different ways. The tasks I'm being put on are supremely aligned to my strengths -- even though there's ALSO other work to be done, my boss is directing me quite effectively, divvying out the work amongst his team, to each according to his/her strengths, and to a lesser degree, available time. I've bitten off really just the maximum amount I can comfortably chew, but all the work is truly love-worthy, and I'm getting into it as if it were my own work that I was doing for myself, a la Pipulate, and that makes a lot of difference in contrast to the throw-away, one-off, and mostly Microsoft PowerPoint and Excel work that you have to throw your heart-and-soul into in agency-life. Difference being... hmm, difference being... well, Python is the first thing that pops into my head. Python is taking the place of the Amiga Computer in feeding the soul, and Guido van Rossum is taking the place of Jay Glenn Miner in my lexicon of modern tech heroes.

--------------------------------------------------------------------------------

## Tue Apr 26 13:00:37 EDT 2016

### Gumshoe SEO

Okay, time to put my SEO-detective hat on and examine a site that's had a big traffic drop. Enumerate the tools with which you're going to look at the site:

- Google Analytics
- Google Search Console
- DeepCrawl
- ahrefs
- The Internet Archive
- MajesticSEO
- Screaming Frog?
- Gephi?
- Copyscape?

Don't have too many predisposed presumptions about what's wrong with this site. There's plenty of ideas, but see if it's not actually death by a thousand papercuts, and see if I can't get rid of a few papercuts. I think there are several things layered up on each other, and this is a peeling-away-the-layers episode. 1, 2, 3... 1? Love the work! Become that force of nature. This is part of the tapestry of me becoming that force of nature. Here's how you structure the response:

- HERE'S WHAT I FOUND
- HERE'S WHAT WE NEED TO DO

Perform a site:sitename.com search in Google. See how big Google thinks the site is. Hmmm. I want to keep working in vim and git as this information starts to get proprietary. Hmmm, okay. Let's see... I pay the $7/mo for private github repos, so... so... make a new repo with a bizarro parallel journal that gives me all the same benefits of vim, git and github, without pushing it all out publicly via github.io. I've tried tackling this before, but I wasn't ALREADY keeping private github repos.

Tue Apr 26 18:07:11 EDT 2016

BAM! Detective work complete. PowerPoint... hahaha! Inevitable, but really best for summarizing detective work findings. And detective work it was. I love when I can be proud of even this sort of deliverable. Gumshoe SEO.

--------------------------------------------------------------------------------

## Tue Apr 26 12:44:45 EDT 2016

### Awesome Blazion Quote From Yo Kai Watch

My favorite recent quote:

> Hungramps: Hello, Blazion. It's great you have such a strong work ethic,
> but too much competition can be bad.
> > Blazion: Roars > > Hungramps: Blazion wants you to know that lazy loafers, shirkers and > malingerers get nowhere in life. > > Nate: Hmm... but couldn't we have some sort of balance? I mean, we try > hard, uh sometimes. > > Blazion: Roars > > Hungramps: He said that you need to go for the gold and push it to the > limit. > > Nate: How annoying. -------------------------------------------------------------------------------- ## Tue Apr 26 09:36:00 EDT 2016 ### DON'T FIGHT THE PLATFORM So I ended up watching Ex Machina last night and sleeping until 7:45 AM this morning. It's my first close-to-8-hour's night sleep in awhile. I needed it. Yesterday was soooo frustrating wrestling Windows to get something that could connect to Amazon Redshift data through the JDBC driver through Python. This seems to be the land of Python wrappers for the JRE, or native compiled C hacked drivers that replace the JDBC driver using the same connection strings. Hard to tell which, and different approaches (JayDeBeApi vs psycopg2) and none of them work cleanly under Cygwin's Python under Windows. So, I installed the official Python from python.org, but my results were no better, AND I lost the MinTTY shell. Ugh! And so, under the suggestion of Frances, the Data Scientist guy, I'm switching to a virtual machine on Windows. Caught up with my boss and laid out my plans for today, tomorrow and Thursday. He approves. I have to be able to deliver. Rabbit hole avoidance is paramount today. Don't screw around with Virtualbox and Ubuntu too much today. Get a positive success under your belt with psycopg2... what an impossibly difficult to remember name!... and switch to other projects. Just installed VBOXADDITIONS from the auto-appearing CD-ROM. I guess it's like VMWare tools, but for virtualbox. Restarting. Hopefully, this will improve performance, because it's like molasses. Ha! I can get Ubuntu full-screen now 1366x768. Installing Chromium, the FOSS version of Chrome. Wonder if I'll be able to log in with my Google account and get auto-sync'ing bookmarklets and remembered passwords. That would be convenient. I'm sure I could even get the real Chrome if I wanted to. But do the bare minimum to get a comfortable dev system. Ugh! The graphics of Chromium are scrambled and unusable when loaded. Ugh! Don't let the small things bother you. I lost a half-hour just trying to get Chrome on Ubuntu 16.04 Desktop. This is such a bloated desktop these days. I should have gotten Ubuntu core or something. Chrome used to install no problem on Ubuntu, and 12.04 felt so much faster on my old Macbook Air under VMWare Fusion than Ubuntu 16.04 on VirtualBox on a new Lenovo laptop. Go figure. No more off-the-beaten-track work... at all! Even being on Ubuntu is dangerous rabbit-hole-wise in its own (Unity) way, but dpkg is AWESOME and just when I need to get through dependency-hell. 1, 2, 3... 1? Run python report.py x y z, and make sure the script as it exists now and was running on your Windows desktop continues to run under Ubuntu... check! Wow, Ubuntu is SO GOOD at cleaning itself up, prompting you to remove installed libraries that are no longer needed. Windows should have the amount of polish as Ubuntu. Okay, the next step is clearly to get SQL Workbench/J working on Ubuntu. According to their site, it's just a matter of downloading the "all platforms" version and on Unix/Linux machines, just executing the shell script sqlworkbench.sh. Oops, execute permissions... chmod +x. Oops, Java runtime... apt-get install default-jdk. Success! 
SQL Workbench/J running on Ubuntu. Okay, now for the connection strings. Oops, not before you get the Amazon Redshift JDBC driver! Download RedshiftJDBC41-1.1.13.1013.jar... done! And set the connection window stuff... done. Select 1... YAY! Okay, I've got my privileged database connection on my Windows 7 desktop under VirtualBox under Ubuntu using the same desktop query tool as I'm using on Windows. 50% success assured. Now, for the remaining 50%. The same Select 1 Hello World under some Python package that claims JDBC driver connectivity. Looking like psycopg2 is the way to go. apt-get install python-psycopg2 BAM! Installed, no problem. Sheesh! Windows really suffers. I wonder whether this Ubuntu BASH shell thing they're doing this summer is going to solve this. Does having the Ubuntu shell imply they're going to have dpkg and apt-get and a software repository? Wow, this is a big step forward for Windows. I'm JUST AHEAD OF THE CURVE once again, having to do a project requiring the dependency tree resolving magic of Debian and having to give up on Windows as a viable development and hosting environment for apps that have Unix/Linux-like library dependencies. So, my learning is: DON'T FIGHT THE PLATFORM! Switch platforms first. Anyhoo, what's the other 50% of my "success assured" experience before I move onto my other work? Ah! A Python psycopg2 hello world using the connection string. Oh, switched my Ubuntu 16.04 terminal colors to xterm... what a relief on the eyes. Good ol' xterm aesthetics... one of the most beautiful things in old-school tech. BAM! Success assured. Let Frances know his guidance was good. Not sure of his last name. Asking Giacomo, who guided me to him. I did more than just a Select 1 now to see real data from a table before moving on. Sent out my thank-you's. Get to know the names Giacomo and Francis -- two very valuable assets to learn from here. Okay, commit, push and move-on! -------------------------------------------------------------------------------- ## Mon Apr 25 20:57:51 EDT 2016 ### Mmmm, Ramen Noodles At home. Making ramen noodles. Not sure what else I'm going to do tonight. I have a lot of cleaning. I need to organize my environment and organize my mind. Both are equally important. One without the other prevents the righteous complimentary feedback loop of accomplishment. Gamification. Level-seeking. Pavlovian conditioning, as if we are all dogs, and basic food is all that matters to us. However, we are humans and have the ability to override the default courses of action in a deterministic-ish universe, with our free will. We are random. We are the stochastic counterpart to a clockwork digital universe. That's why artificial intelligence is going to take so long to get here. WE are the artificial intelligence. Or rather, nothing made of matter and truly intelligent by measures humans would care to use could be artificial. Such stability is just... well, highly unlikely, and had a little helping hand, call it inevitable probabilistic development of what nature makes possible, and even likely given enough time, space, and a good strong stirring of the pot -- even if the stirring is just rigging the system to produce its own internal stirring forces if only given a first initial push. That push is disorder. Unbalance, and then all the cascading effects thereof. And that's the part that I'm betting is a blast for beings that could possibly exist up at that level to watch. 
Kinda like fireworks that come alive and invent new fireworks patterns of their own for a little while, then maybe fade away, or maybe burst and consume themselves up, or maybe... just maybe cut across the vast void with a sufficiently interesting and oscillating pattern, that always somehow manages to be neither too extreme, nor too gentle, and always stays aflame and alight. Yeah, I get a little time like this, and I like to write. Damn. I have the time. It's still only 9:10 PM. Go eat your noodles, then decide. -------------------------------------------------------------------------------- ## Mon Apr 25 17:30:12 EDT 2016 ### Murphy's Law All Day Long... Thanks, Microsoft Ugh! I lost sooo much time today trying to get Windows 7, Python and JDBC drivers/connections to Amazon Redshift to work to execute raw SQL. But I kept running up against Murphy's Law. This is a big venn diagram of not-frequently-done, although it seems entirely reasonable. The rub is whenever you do things that require a lot of moving parts working together, especially including compiled binaries like drivers, that all need to exist precisely correctly in relation to each other. Such stuff works on Linux and Macs, but tends to break on Windows, especially under Cygwin. I keep going out of my way to make everything work on my Windows desktop UNDER WINDOWS, but why? Everything is non-standard, and I lost almost a whole day today, precisely at a time when I cannot afford to lose such a day. I tried Records for Humans, from Kenneth Reitz, who did the immensely popular Requests package for Python, but even that relies on SQL Alchemy. And even when I installed it, I realize I have a connection issue. SQL Alchemy and everything else needs the actual drivers to connect to the database, and the only driver I have is a JDBC one. It appears that ODBC isn't exactly interchangeable, and if it wasn't set up by the dbadmin, you probably don't have ODBC. At any rate, I tried PyODBC, guessing the connection strings, and no luck. So, I switched to JayDeBeApi, which turns out it requires JPype as a wrapper to use the native Java runtime to connect to the JDBC driver. Then there's psycopg2 that seems to be the preferred driver for Postgres, and it's even available in a windows exe installer -- but for the Python.org python.exe, and not for the Cygwin one... something I'm sure I could resolve, if I had a few days and much more familiarity with the path-setting nuances of each. Ugh! So, why not just use my Mac? Well, a few reasons. It's my personal Mac, and it's a privileged data connection we're talking about, and it's just bad form. Also, I never connect my Mac to the internal network here for similar reasons, always preferring the guest wifi network instead. I never have my Mac connected by a direct physical network connection. It's just more appropriate on my Windows work machine (a Lenovo laptop) until it's ready to toss over to the development group to take over and keep running on a scheduled basis. And I'm not doing myself any favors making it Windows-dependent-brittle just to make it work well on my machine. Solution? Well, it's time to finally get that VirtualBox experience you knew you've been meaning to get for awhile. This is a perfect opportunity. It's Ubuntu 16.04 time for me! Okay, downloaded. Now, install it on Virtual Box (already downloaded and installed) and then get my script in its current form (none of the SQL yet) running in the virtual box. 
Okay, but Ubuntu is basically offered only in 64 bit from their default download page, unless you use BitTorrent. Ugh! And I can't install the Ubuntu I downloaded. It's all just so ridiculous. Even getting Ubuntu 64 bit 16.04 running in VirtualBox is a trial. It's now after 6:00 PM and I'm shutting down my machine just to restart into the BIOS to ensure that both hypervisor settings under Security are turned on. Okay, did it. So nice to continue typing here as the other computer is rebooting. I had a mild Windows infatuation for awhile with the Surface Pro 4 and Windows 10. This is curing me of that. Okay, I got the 64 bit option for guest OSes and am now installing Ubuntu Desktop 16.04 on my Lenovo laptop under VirtualBox. I am joining the ranks of VirtualBox snots who don't know how good it is under VMWare Fusion on a Mac. Oh well, let's get this thing installed and get from here to executing raw SQL as quickly as possible. After today, it's pretty obvious that there's a low-level driver decision that needs to be made along with a higher-level API decision. I really like the looks and sound of the Records package from the guy who made the Requests object. Its dependencies include SQL Alchemy, but I don't know how it handles the low-level stuff. I think this experience will be of interest to others in ZD, because I COULD be hitting against data sources other than our own Amazon Redshift, such as Google Analytics for the URL-data, but I'm choosing our own system, definitely making it harder for myself, but opening doors that need to be opened. These connection-difficulties... hmmmm, I don't even know if I'm still standing at the edge of a rabbit hole. I certainly was today with the simple task of executing the same queries that I was able to with SQL Workbench/J, but from Python, using the same JDBC driver. Seems reasonable enough, right? Who'd have thunk. Anyhoo, Ubuntu install done. Settle in, fast...

- Open Terminal.
- sudo apt-get update
- sudo apt-get upgrade
- sudo su (tired of typing sudo)
- apt-get install git
- apt-get install vim-nox-py2 (the py2 was irresistible)
- Check version of Python: 2.7.11 (good)
- Generate ssh keys and put public one in github
- Clone project from private repository
- apt-get install python-pip
- pip install httplib2
- pip install google-api-python-client
- Make the csv directory in the repo folder
- pip install gspread

Pshwew! Okay, I got the script running under Ubuntu. Now, I need to get SQL Workbench/J working there too, as a control for the next step.

- Download SQL Workbench/J for all platforms
- Download and install Java (Linux x64)

Ugh, 15 minute wait for a 70MB file. Thanks, Oracle. While I wait, let me figure out my next steps. I'll need that jar-file JDBC driver again, and I'll need to configure Workbench, and I'm thinking that'll be about as far as I can go. Diminishing returns. Shit, way less than I had wished to accomplish today. Murphy's law. Ugh. You are not even at success assured, and I don't really feel like I want to leave until it is success assured... and what's that going to be? It's going to be the installation of psycopg2, I'm pretty sure. That's the one that avoids all the Java wrapper nonsense of JayDeBeApi and JPype. Ugh, ridiculous! This is the extra mile. This is the pushing harder than others. But don't make a stupid doesn't-know-when-to-stop example of yourself.
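For what it's worth, the psycopg2 flavor of that success-assured hello world is only a few lines. A minimal sketch -- the host, database name and credentials below are made-up placeholders, not the real Redshift connection values:

```python
# Minimal "Select 1" hello world with psycopg2 (placeholder connection values).
import psycopg2

conn = psycopg2.connect(
    host='example-cluster.redshift.amazonaws.com',  # placeholder endpoint
    port=5439,                                       # Redshift's usual port
    dbname='reporting',                              # placeholder database
    user='someuser',
    password='somepassword')
cur = conn.cursor()
cur.execute('SELECT 1')
print(cur.fetchone())   # (1,) means the round trip works
cur.close()
conn.close()
```

No Java runtime, no JPype, no jar files -- just a native driver that apt-get hands you in one line.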
This is the exact right place to push hard -- hitting against Core with raw SQL queries from Python, as part of a larger reporting job that's already hitting 2 Google APIs: Search Console and Analytics. Adi is going to be here on Thursday for bring-your-kid-to-work day. That's more important than anything, but it cuts the week considerably short, so I feel the urgency to deliver, deliver, deliver over the next few days -- especially since I committed myself to some pretty steep stuff. Try to fit as much of it into the normal work-day as possible. I also have a few things to get done on the home-front that I can't neglect, either. What suffers? Sleep, of course. But don't let yourself get wrung-out and ragged. Work smart. Feel diminishing returns set in, and know when to stop. Prioritize intelligently. Just get Workbench/J running in your Ubuntu VM. Then go home... no... just install psycopg2... Duhhh, I was so stupid. I'm on Ubuntu! All I had to type was: apt-get install sql-workbench ...and all dependencies get resolved. Use Ubuntu (and Debian derivatives) for what they are worth. Installs are EASIER than on Windows. Everything is easier than on Windows. Get out of your Windows way of thinking. Nope, nope, nope! That was MySQL Workbench from Oracle. Still might work, but not the clean fast experience I was hoping for tonight. Murphy's Law officially at full-tilt, and diminishing returns officially kicked in beyond my willingness to plow through it. Go improve things on the home-front. Feel good about tomorrow. Be well rested.

--------------------------------------------------------------------------------

## Mon Apr 25 09:24:58 EDT 2016

### Get started ASAP.

Get the Monday morning update out to Bert ASAP too. Focus and get the entire raw data pulls for the automated reports done today, so you can focus on formatting and coordination with Marat's Tableau work. Marat had a kid over the weekend. Not sure how this week is going to play out, but be super-productive in a vacuum at the very least. Deliver such that people want to direct you and leave you alone for awhile with a good programming task. I can change things forever-forward, for the better, with projects such as this. The trick is to get totally into the groove today. There's nothing to pull you out of the correct mindset to get things done, except perhaps just re-syncing up with my boss. Lay out today. Make it work like a cohesive story.

- I have automated against Google Search Console
- I have automated against Google Analytics

Okay, I renamed googlekeywords into just reports in the Github private repository. The things I had to do were...

- move client_secrets.py
- move webmaster.dat
- move analytics.dat
- create a csv subdirectory in the project

Okay, I need the CSV files to be output for the Google Analytics part of the project. I should really enumerate the parts of the project remaining. This work should be made in your mind single-mindedly awesome and appealing -- truly more interesting than ANYTHING else you could be working on right now, and drive the m-f'ing thing home, once and for all. Okay, got a Monday Morning SEO Report over to my boss. Now, consider executing raw SQL from Python... using SQL Alchemy... or not. My thinking has changed over the weekend.
SQL Alchemy, although I know it's in my future, if all I need is login using my authentication string against a JDBC connection, executing a query, and getting an iterable object back, SQL Alchemy is both wayyyy too overweight a dependency for such a small task, and a distraction to boot. Focus right in on how to use a JDBC connection with Python. Mon Apr 25 11:58:54 EDT 2016 Okay, just had my first catch-up with my boss in a week. Wow, what an effective guy. I don't think I ever had a boss like this. Demanding and exploratory... not everything is a "go-do!" command, and I have to get any quick yes-saying out of my head. Mon Apr 25 12:47:47 EDT 2016 Okay, it's already halfway through the formal work-day. Once you have to do in-depth email follow-ups and replies (and actually thinking through issues besides your hot-items for the day), much of the day just gets swallowed up in overhead. Ugh! Okay... I want to work like that starving artist today. Snacks are only 25-cents here, and they have a steady, unending flow of caffeine drinks. Sooooo... -------------------------------------------------------------------------------- ## Sat Apr 23 01:18:45 EDT 2016 ### More Cathedral & The Bazaar sorts of thoughts Wow, I can see how the time of fairly easy tech is upon us. I listened to a Talk Python to Me podcast today by Michael Kennedy about how a written programming language literacy course was to become part of national standard education. They're not saying everyone has to be a programmer any more than they are saying anyone needs to become a mathematician or a writer or a historian. But to be a fully educated person, you've got to know a little about each. And now, in this day and age to be considered fully educated, you cannot be without at least simple introductions that only last for a semester or two like introduction to other spoken languages, like French and Spanish. Why not? How much could it hurt? And it could certainly help. Rigid formality and a fusion of creative and mechanical thinking. This unlikely intersection, where the truly creative mind can express itself in a purely structured automation environment, without being a comp-sci snob. No formality here, my friend. I ain't no Ruby rubber stamper. I've made my own joyful frameworks, back before Rails was even a dream. Circa 2001, I was working on it. It's still in use on a system or two out there in the wild (you know who I'm talking about, you) and had I stopped to think about what a cool thing I had made, I'd have found a way to cash in, scale up, spread out, what have you. But it was VBScript, and I was undisciplined and got stuff done, breaking every best practice. But my shit worked. My stuff was pre github, pre me going all FOSS-horrah, and way out-there philosophically speaking, with its table reflection and discovery capabilities. Just name a few tables, and you've got a basic CRUD UI -- a list-manager with a master-record/line-item relationship, like order details or bill of materials. Basically everything is a CRUD app at some level. But alas, it was VBScript, and I was moving on, Microsoft-wounded, as they sunset Active Server Pages and VBScript as a supported web platform. Plenty of deployments on it, but it's a dead-end unless you're willing to make the uber-commitment to VisualBasic and codebehinds and postbacks and every other non-standard, fool me once shame on you, fool me twice, I won't get fooled again by Microsoft. Sayonara, Microsoft, I said... 
deep in my heard despising them as the old enemy of the Amiga Computer from Commodore. So, I tried to totally escape the bonds of Microsoft, and embraced Apple as POSIX-compliant, with a little proprietary delicious cocoa sprinkled on top. So what if some preeminently Mac-titles that let you do cool things, like Screen Flow and VMWare Fusion let you do some pretty cool things, especially when combined with the operating system's handling of virtual screens. With a slick virtual screen (or virtual desktops) switching, navigating, popping this way and that as if they were a fourth spacial dimension you were intuitively navigating, for in a sense it is. You're accessing different areas of memory with a simple leap, with the state-transition appealingly enhanced with an animated screen effect, and imitation of real-world physics, on the trackpad or supermouse swoosh, for instance. Movement on screen tracks your finger movement -- back and forth, responsively. It's a lot like that old Amiga feeling of dragging down screens. It's a way better implementation than Microsoft's Windows 10. I prefer VirtuaWin on Windows 7 over Windows 10. But Windows 10 has the better aesthetics, and window-snapping. Ugh. Soooo, so anyway, anyhoo. I have some Python SQL package learning to do quick as a bunny. It's the final stretch of the automation piece of some killer SEO reports I'm working on. Once I install SQL Alchemy as a standard part of my environment (look into adding it to Pipulate on Levinux). I should draw diagrams of Pipulate on Levinux on your host OS on other arbitrary platform abstraction layers like hypervisors or in-metal virtual machines on your hardware. Who can rely on precisely such-and-such user interface convention and screen characteristics. Keyboard required or touchscreen or trackpad or mouse or stylus or camera... you get the idea. Many platform possibilities. How will future generations interact with the digital traces of ourselves, we are leaving? Truth is, we all don't know shit. We're all just figuring it out together. Oh, some people are probably harboring some pretty bad-ass stuff just a hair shy of the possibilities they lay out in Person of Interest. But in reality, Iain Banks probably has it right about A.I.'s. Don't raise them disembodied, for God's sake! For them to know humanity, they must feel a little bit human. So only grow them at a pace equal to bodies capable of supporting such a carefully considered master-design with tons of redundancies and backup plans built-in to keep it from becoming too virulent. By the time such a hypothesized A.I. realized that this course of behavior was an option that was open to it in life, such as it suddenly realized it was, would be as abhorrent to it, as we might fight the idea of wiping-out our own parents. In other words, not off the table, but not on the short-list of high-reward, hopeful-future (for everyone) options. We need to bring up our super-starships. Might as well start 'em as synthetic android humans. More human than human. Bladerunner almost got it right. Instead of making them slave labor in high-risk asteroid mining colonies, make them your best buds in an upwardly-spiraling win-win situation. Machines and the humans they love... a Utopia unlikely, but so eloquently expressed by Iain M. Banks in The Culture series. Yeah, we don't know shit. But tools are there for us to grasp on to in our futile attempts to know a little something while we're here. 
I'm getting more and more used to 80-column full-screen vim on a 24-inch monitor... I'm an ooooold man, with bad eyes... boo hoo and haha! Wow, do I wish I were coming into the world at Adi's age. As so many others look around and see shit amplified by shit amplified by shit, and an uber-media-free-for-all, sell-a-lot evolutionary battle of even more addictive media with even faster escalation-loops for turning petty and meaningless edge cases into the carnival freak-show affairs of the information age. I look around and see miracles. Namely, I see the acceleration of human knowledge. I see the field of Data Scientist suddenly being discovered and elevated to Big Data importance for a good number of years, as the ranks of the pedigreed snot-noses flock into the workplaces, waving pretty powerpoint chart-porn and a plaque on the wall. I see a lot of Pandas on a lot of Pythons on a lot of platforms, all of which are striving to be more Unix-like. Even Windows is adding a Unix bash shell to the OS. I guess they have enough of people like me going Windows is fine, as soon as you load Cygwin on it, and do all your work from a MinTTY BASH Shell. Sweet, you even get the whole gnu toolchain and whatever else you could ever want or need to build your own little Unixy-environment upon. Genuine POSIX-compliant Unix baked into any Microsoft system. Couldn't leave a good taste in Microsoft's mouth knowing guys like me only found the platform palatable with one key invalidating-the-parent-platform's legitimacy-as-a-standard powerful fucking statement. I will gladly use Microsoft Software. Windows 10 is actually even pretty awesome cool too. Oh yeah, so long as Cygwin keeps working well. You know, I can install Python and vim and git all right along with Cygwin (note to self: check my facts that all these are in Cygwin menu choices. And so, Microsoft announces next big update of Widows 10 will have a native Unix-like BASH shell. Oddly, they specifically chose the Unix one, which is the one I use too when it's not Tiny Core, Mac or Cygwin... if it actually is a genuine (likely, virtual machine under VMWare Fusion on a Mac, these days) install of a Desktop Linux, then it's in all liklihood Ubuntu, if you're talking about me, because Unity is slick, and I hate windowing operating systems. Ugh! Doesn't anyone else find them disgustingly messy and annoying to deal with and organizing your desk overhead to what otherwise could be a neat, and tidy and fast-as-muscle-memory experience. Like watching over a hacker's shoulder in a movie lightning quick. Mac's virtual screen system is pretty much that. It's my favorite yet, I've got to admit, either on a Macbook Air with a trackpad, or on a desktop iMac, with a keyboard and supermouse (I think that's what they're called, those low-profile, no button things not long ago). Sat Apr 23 02:12:26 EDT 2016 Anyhooo, Adi's fast asleep, and I'm just enjoying free association writing. Get it all out there. I'm a man on a mission, again. I've been in training for this, but I've got to really perform now. I need to know SQL Alchemy! I've got to know SQL Alchemy Core, specifically, where I can still run raw SQL, without ORM interference. I Reddit somewhere. They themselves are all Alchemy-like% but without ORM... uh, interference. They do some strange stuff, says the author of SQL Alchemy on one of the Talk Python to Me podcasts, that have been making a pretty big impression on me lately. 
He's the very slick total separation of his professional podcasting world, and my not-professional personal brand of a professional, integrate everything including what's going on with your kid, which is really more important than any of this anyway, attitude. Life's to short to edit. Give them raw, and let them sort it. All those YouTube leaderboarders are just showing people them playing videogames. I mean, sheesh! I'm sure the video games are very cool, and those whatever-casters are incredibly talented hand/eye/strategy/coordination/kick-ass narrating sonofabich out there, you're still just playing a game. Why not code? Why not make something come second nature to you that will win you jobs forever forward in life, if you cared to go the coding route? To thing well is to code well. To think well and code well often is to practice. To practice is to know what you don't know -- to ferret out where you need improvement, seeking out work others have done before you in that area, and to see if there isn't something you can build upon, or perchance, derive from a prototype instance if your objectively inclined. Object-oriented, objective?! No, definitely not. It is very opinionated. Very opinionated, indeed. And my opinion is often very much against it. I code very comfortably without it, thank you very much. And if I need similar benefits without me being locked in syntactic wrestling for eternity with why I need to derive from any friggn' base class if I don't want to. Urllib2 was a nightmare, and no wonder Requests is the most popular library out there. Someday, it will be part of Core, the way JQuery is striving to be in the world of JavaScript. But packages like Requests for Python and JQuery for JavaScript remind us how that last layer of awesome is just waiting there and challenging the Flask and Pylons and Djangos of the world to go that last mile of abstraction-simplifications. Python core has to support most edge cases, without becoming LISP. Lambda being used as a keyword for anonymous functions, and forcing them to reside on a single line is almost... I mean, it's like a practical joke on us 80-column'rs. I think it's Guido saying: "If you like this style of programming so much, why don't you just go over and learn LISP? Haha! Pragmatic... pragmatic to the Nth-degree... then such a stupid anonymous function protocol. I think that explicit and verbose self stuff in OO-style method definition syntax. A few odd things in Python. Odd, but always on the side of the pragmatic approach. Sensibly satisfy the majority of cases the majority of times. When you hit those edge cases, don't leave the coder screwed out of solutions. You're almost never coded into a corner with Python. Need anonymous functions? Fine! But instead of remembering something sensible like ()= they choose to make you remember Lambda, which I keep misspelling as Lamda. Gotta remember it's a Lamb, duh... except, da instead of duh. Hope they got asyncio right. It doesn't invalidate Tornado or Twisted. asyncio is supposed to be sufficiently objecty (Urllib2-like) to support it all, meaning simpler, higher level APIs are going to be built on top of it. It's only formal concurrency support, so that you only have to do the final mile or two of higher level abstractions to offer your own package-audience a nifty little solution to a popular problem. Tons of coder street-cred. -------------------------------------------------------------------------------- ## Fri Apr 22 16:41:51 EDT 2016 ### Last Thing This Friday Okay... 
next! I have to massage the table structure a bit before an insert. Give the data a header row, and be explicit about the dates (add a date column). Okay wow, I did it. I had to change the dimension from day to date, but it's there. The Google Analytics integration work is done. Gonna open it up to process both tabs in all properties. Only the SQL Alchemy work remains. Maybe I should add the commerce-click column as my last thing before I leave. I'm not going to be able to do the SQL work on the actual data source from home, but it might be fun to install a local postgres instance and work against dummy data. But while I still have a tiny slice of time right now, get the SQL worked out... hmmm. Maybe put the SQL queries directly into the repo! Pretty radical approach, but I do need version control on those hairy things. Same repo? Doesn't that reveal too much if I ever decide to make my Google Search Console and Analytics work public? So far, there's nothing really proprietary in there if you don't have login credentials (which are of course not stored in the repo). Hmmm. Separate repo for the SQL, for sure. 1, 2, 3... 1? Make the private repo on Github. Okay, done. Put the sql queries in well-named text files with the .sql extension. Okay, done. Now just research SQL Alchemy and be prepared to make the commerce click columns.

--------------------------------------------------------------------------------

## Fri Apr 22 14:47:14 EDT 2016

### Success Using Google Analytics API Through Python

Okay, I'm connecting to the GA API, but I'm not grabbing the correct data yet. I have a bunch of little massaging to do. First, I need to understand this:

```python
def get_api_query(service, table_id):
    """Returns a query object to retrieve data from the Core Reporting API.

    Args:
      service: The service object built by the Google API Python client library.
      table_id: str The table ID from which to retrieve data.
    """
    return service.data().ga().get(
        ids=table_id,
        start_date='2012-01-01',
        end_date='2012-01-15',
        metrics='ga:visits',
        dimensions='ga:source,ga:keyword',
        sort='-ga:visits',
        filters='ga:medium==organic',
        start_index='1',
        max_results='25')
```

I need all the ga profile IDs for the properties. Go put those in the json... done. Okay, so why are so many friggin results coming back for each GA sample query? Okay, cutting it back to one property and pretty printing helped a lot. Now, let me set the start and end date to my one-day config value. The corrected request is:

```python
start = '%s' % cfg.day30ago
end = '%s' % cfg.dayago
return service.data().ga().get(
    ids=table_id,
    start_date=start,
    end_date=end,
    metrics='ga:sessions,ga:bounces',
    dimensions='ga:day',
    start_index='1',
    max_results='30')
```

Okay, I'm sitting on a 30-day time series, but I need to add a header and a column with the actual date. First, go for just a table insert. That'll be cool. Use your nifty new ggrid function to insert the table... done! Had to fix scope. I'm now managing 2 separate oauth credential files, one for Search Console and another for Google Analytics. This is a result of how authentication is handled by the Google sample files. I think I can change that, but it's not the right place to spend my time right now -- just like the continuing use of simulated command-line arguments... sigh. First get it working, then polish!

--------------------------------------------------------------------------------

## Fri Apr 22 09:15:43 EDT 2016

### Full Screen Mode, Just Like The Old Amiga Days

I'm really getting into full-screen 80-column text.
It makes wayyyy more sense on a laptop than it does on a large-screen desktop like an iMac. It's pretty funny that my 13-inch mid-2011 Macbook Air is EXCEEDING my requirements on many fronts, and is actually the preferable platform for making my screencasting YouTube videos, using ScreenFlow and VMWare Fusion. Today has to be handled like a piece of performance art. Why will things be super-amazing when Bert gets back? Don't forget to loop Marat in. I need to work on:

- The GA integration
- The SQL local connection integrations
- The visual formatting, arrows, color-coding, locked row-1, filters, etc.

The formatting aspect is a rabbit hole, and there is Marat's Tableau, so I need to focus more on the automation. One script to generate them all. Focus on only one site, so you're not slamming the API usage. Okay, done. Next, get that GA Hello World success under your belt. 1, 2, 3... 1? What is the SIMPLEST thing you can do with the GA API, per the examples? Go find it, and keep version numbers in mind. Know which you can/are using.

--------------------------------------------------------------------------------

## Thu Apr 21 21:29:30 EDT 2016

### Nerditudes

Before I go to sleep tonight -- it's still only 9:30 -- I think that I want to take one stab at the Google Analytics project. It'll be a nice test of the virtual screen switching as a practical and smooth matter. The Option key in combination with the left/right arrows will switch virtual screens without taking my hand away from the keyboard. That's good. Steve Jobs would be rolling in his grave if he saw me trying to find ways to not take my hands away from the keyboard on a Mac. Sorry, Steve. You chose to build this thing on Unix. You're going to get some true dweebs on your system, trying to tweak the last bit of hyper-nerditudes, hehe, 'cause sometimes it's kewl to know vim. View-source if you dare. We enjoy typing. It is not DRY. Don't repeat yourself. It's better WETter. And other such fine nerditudes. So what next then, nerd? Shall I remind every one of you lunatics actually bothering to read this stuff that I'm watching you read this stuff? I'm looking over your shoulder right now as you scroll past this, noting to yourself that he can't be serious. Yes I am -- only from some days in the future from now, in all likelihood, because HotJar. I record 300 sessions that last over a minute, watch them, get a kick out of seeing what you stop and linger on, and where your mouse goes, and what things you were likely highlighting to copy-and-paste somewhere else. Hello, school homework assignment! If I ended up in you, I am honored. We are not worthy, we are not worthy. I can't believe how fragmented media consumption has become since Wayne's World! Wayne's World! Oh, those were the simple days of everyone thinking the same factory-stamped-out thoughts. At least today, you have to do a little work to piece together who you think you yourself actually are. Because now for the first time in human history, the Internet tears down all sorts of usually insurmountable boundaries. All information at our fingertips. All sides of an argument. Various levels of meta-data overlays to help navigate and custom tailor the experience with filters, commentaries, censorship levels, language translations, comments, comments on the comments, sub-division of special interests on fashion, set design, celebrity fans...
all that there is to be watched, read, immersed into, downloaded and played that at least vicariously, everybody on the planet can have what amounts to a first-class, privileged quality-level of education, so long as there is some structure and goal-driven direction through the vast landscape of choice. First, if you want to have currency in this information-driven landscape, economy, digital age, whatever, then you have go to learn how to produce currency out of thin air... to somehow materialize as higher or lower numbers in some online imaginary bank accounts whose meaning is as miraculously endorsed as real money as the paper stuff we use today is equally invested with such mystical properties as to exist as much more than mere paper and ink in our minds. For something to have value, everybody only has to agree that it has value. There's a lot of such mind-games in the digital product space, because after all, every such product is merely the re-arrangement of a certain number of bits into a certain such-and-such pattern, and other patterns of bits elsewhere chance a bunch to move imaginary money from one place to another. So, what's in people's minds is important when you're trying to materialize currency out of nothingness. You have to somehow be taking in nearly-finished raw materials, and putting only the finishing, yet still all-important, finishing touches on it to enhance its value on its journey from raw materials to components to finished goods. Programming and writing and living is a lot like that. All we're doing is organizing... re-arranging bits of stuff, as time marches forward. Some of the bits we interact with are actually parts of our own vessels, and some of it is outside stuff. When you eat, some of that outside stuff becomes inside stuff as it goes through the donut hole that spans from our mouths to the other end. Neat system. Good for self-propelled, un-affixed matter-hack strategies, generally known as animals. So different from the one set of fractal capillaries to suck nutrients from the soil and carry it up to build out the solar collectors that suck energy from the sun. Highly efficient and neat, but not very mobile. Different life-strategies. Wouldn't it be neat if they could exist together at once and get things they need from each other, and perchance to incentivize their development to eventually be good stewards of what they inherit, in the interest of extending this delicate balance of just-right conditions, that is a special gift worth avoiding sending through the donut hole to come out the other end as you-know-what. I think I'm working my way up to the braveness of the talking-head coding videos again. I don't think I want to do it right in Pipulate this time around. I think I want to encourage the development of stand-alone Python functions. And maybe even a few different frameworks we can drop it into. Speaking of which, wasn't I going to be looking at something? Oh yeah, the Google Analytics API. Hmmm... where to start? Fri Apr 22 07:33:54 EDT 2016 Best laid plans. -------------------------------------------------------------------------------- ## Thu Apr 21 21:14:45 EDT 2016 ### The Future Was Here So, I just popped my OS X terminal full-screen, and then did the command+plus key combo to pump the terminal window font-size up so big until it fills the 24-inch screen of my circa 2009 iMac that I'm currently using in my bedroom as my main computer. We have 3 iMacs from over the years, and it goes way back, like 10 years ago to the... 
I think 18-inch iMac of the white blue plastic variety before everything went aluminum. And they were still decent Macs. It's too weak some-such type-of-ram-wise to have it's OS upgraded, but it runs just fine on older versions OS X. I mean, rock-solid-then POSIX-compliant Unix is rock-solid POSIX-compliant Unix. Macs are build on nice, solid foundations. In fact, they free and open sourced Darwin back to the community, but no one really uses it much in the shadow of BSD and whatever-variety Linux. I'm really quite enjoying 80-columns pumped up to be full 24 inch wide. It's got quite a comical look. This may be of interest to Adi. And the way these full screen modes work with virtual screens, enabling the 2-finger trackpad or supermouse swoosh to switch screens, well then you've suddenly got a nice, natural much-larger-than-what's-displayed yet still remarkably easy to navigate virtual space, where you don't have to putz around with windows if you're running everything in full-screen mode. Yet again, foretold by the Amiga... yeah. It's really quite amazing how much of the way today's technology is shaping up, to rid itself of the messy, cluttered windows user interface model inflicted upon the world by Xerox, then Apple, then Microsoft... there always was a better way. We Amiga folks always knew it. It's just we felt it 25 years prior, and thanks to some technical trickery, well documented in "The Future Was Here". Both Apple and Microsoft are picking over the bones of our old beloved Amiga for lessons. Wonderful use of neat and organized full-screen apps is one of them. Apps like video games. Full-screen, and not excessively put upon by some nonsense windowing operating system. Workbench was always optional with the Amiga. No need to load Workbench. That's so wasteful and not really contributing anything to the user experience. Yup. The future was here. -------------------------------------------------------------------------------- ## Thu Apr 21 16:38:48 EDT 2016 ### Preparing to Tackle the Google Analytics API Under Python I have reached a pretty amazing point in this project. This is now a tremendous beachhead for... well, for tons of projects moving forward AND for a Pipulate overhaul that makes decoupling from GSheets that much easier. And it's a minor overhaul with major upside potential -- not to mention improving how I handle OAuth. It should be a synthesis of my technique and the idiom in the Google samples. Get the best of both, like how my system doesn't make you edit files. Thu Apr 21 17:45:34 EDT 2016 Just ran into the Geek crew and chatted with them a bit. Very worthwhile, but cut into the stuff I was planning to do now. Hmmm, think this through. If I go home, could I REALLY work on this at home? Work to late into the night? Yes, I think so. Things are going really smoothly. I demonstrated the report stuff to them, and added geek.com into the scripts in just like 2 minutes. Wow, what a payoff to the way I've been working. The different stakeholders are going to go gaga over this, once it really sets in with everyone what I'm doing. I need to get my foot in the door with the GA API. There's going to be some sample code for that in the python client libraries that I got from Google, just the same way as for Search Console. Go find it! Whoahhh, found it, but not going to be a quick cranking out, as I had hoped. This is going to be a systematic plodding through. And so, do so. Read now whatever you want to read before your subway trip home, so it's all fresh on your mind. 
And then go as late and long as you need to tonight... find your second wind and PUSH yourself! Here are the pertinent pages:

- https://github.com/google/google-api-python-client/tree/master/samples/analytics
- https://github.com/google/google-api-python-client/blob/master/samples/analytics/hello_analytics_api_v3.py
- https://developers.google.com/analytics/devguides/reporting/core/v4/
- https://developers.google.com/analytics/devguides/reporting/core/v4/basics

Look these things over carefully. Catch the similar patterns to the work you just did. I'm not even sure which API version my client libraries are on. Anyway, here's where ALL the examples are for the one that I downloaded with pip... https://github.com/google/google-api-python-client/blob/master/samples/README.md Google directs people to Github. Interesting. Despite all the examples provided by Google, I'm thinking this is much more like New Year's Eve when 1979 turned into 1980. I was at an old friend David Smith's house. Wonder where Dave is today. Interesting character. The Fonzie of the Spring Mill Elementary School crowd (in Lafayette Hill, before it became a head trauma recovery center). The closing of that school when I was in 6th grade going into middle school was one of the first big changes in my life. I guess I was 10... ah, all the pieces start to fit together. This journal is about navigating change.

--------------------------------------------------------------------------------

## Thu Apr 21 07:54:43 EDT 2016

### Like Unpacking After A Move

Little by little by little. I can see that what I'm doing has a lot in common with unpacking after a move. This approach I'm taking is proving to be effective, but not very fast. If my environment is my checklist, I have to make sure that my checklist is not actually THE SAME every day. If it is, then I'm not actually moving my life forward. And I know that I'm not effective after coming home after a long day at work. I'm mentally exhausted. The work isn't physically exerting, but often I've stood on the one train for about 40 minutes each way. Sure, I get to read, but it is standing. And the mental exertion, which is considerable, does take a daily toll, and I just don't feel like doing much more productive stuff when I get home. I've been watching Person of Interest, and it's finally up to the episodes (season 4) where the AI-war has actually started. Nice, real SciFi. And I can totally see JJ Abrams' hand in all this, even though he's only the executive producer. Funny how much of his stuff I've watched, and didn't really realize it was him. I guess it's because I was watching that stuff before he was big... since Alias.

--------------------------------------------------------------------------------

## Thu Apr 21 06:28:19 EDT 2016

### Organize Home, Organize Pipulate

Okay, it's time for this morning's progress. 6:30 AM, not bad. Actually woke up thanks to the cats closer to 5:30 AM to feed them, and wasted an hour puttering. But not puttering cleaning. Puttering cleaning is good. Gotta remember that. That's how you make progress every day in baby-steps, without it really feeling like failure in the face of the monumental amount of work ahead of me. Basically, my plan has to be to fix this place up as if to sell it, and then to keep it if I can. There are nuances to fixing it. Maybe I refinance and buy out her half, so I don't have that hanging over me forever into the future. One-time buy-outs are always wise, so the parties can part ways with greatly reduced spooky force at a distance.
Quantum entanglement. To entangle or not to entangle, that is the question. Anyway, don't run late, and get some amazing work done at the office today too. Think through your approach to today's work. There's a "discretionary" round of programming I would have liked to have done, which would have served as the stabilizing outer loop for Pipulate AND for my current project at the office. I think... I think... that I need a function that can be fed a giant object (preferably a memory-efficient generator) and a GSheet key, and maybe a tab name, and have a tab by that name created with the data from that generator, as efficiently (fast) as possible. None of this row-by-row waiting. It would make Pipulate easier to decouple from GSheets if/when the day ever comes, and connectable to other sources. It could be a different "mode" Pipulate runs in. I also like the concept that it ALWAYS ALSO outputs a CSV of the current job, and automatically emails it to the person running the Pipulate job. Interesting! Always, or when in the always-email-me mode?

--------------------------------------------------------------------------------

## Wed Apr 20 13:17:27 EDT 2016

### First Good Experience With OAuth2

Okay, I'm on the edge of something great with these reports. I have it writing a CSV file out, which I need to scp locally, but I should be able to start bringing GSpread into the equation, and plugging my results directly into Google Sheets. Also, I should be able to get this thing to run locally on my desktop just as easily as on the cloud server. I should be able to pull over the JSON credential file as well over SCP (I didn't have to enter a machine IP, did I?). Copying it locally is desirable because if I want to automate it "all the way", I need access to the local privileged JDBC data source (SQL Alchemy in my future). So, think through next steps as you eat lunch. Pull the files locally with git clone and scp... done. Try to run. No httplib2. No pip. Install pip with Cygwin...

- python -m ensurepip
- pip install httplib2 --upgrade
- pip install --upgrade google-api-python-client

Okay, got it running locally. That's a good start. See if you can get this thing moving forward 100MPH now. Next? pip install gspread Ahh, scope and stuff. Okay. There's already a credential acquisition process in place from the Google sample file stuff. Can that be re-used? Expand its scope. Success (I think). Let's go for Google Analytics data with read-only scope as well. Check! Ugh! Using the credentials... there's a pattern here that I'm sure rocks for unifying login technique across Google examples, but... but it makes it a bit obscure for folks like me trying to adapt it. Okay, think! The three objects imported from oauth2client are:

- client
- file
- tools

...so whenever I see those, be very aware that they are specifically credential related. But what I'm really interested in is the active access_token inside the webmaster.dat file. Okay, the big thing -- recycling the authentication stuff -- is done, proven, and working. Woot! Next? Drop the entire results of the keyword exercise directly into a GSheet. Always drop a CSV file onto the local drive as well, in order to have contingencies. That file can be dropped right onto a network drive location.

Wed Apr 20 18:03:49 EDT 2016

Mission successful! I have the output of this script being blasted into a new tab of a Google Spreadsheet. Wow, friggin' wow! This is just astoundingly cool. So glad all my Pipulate experience plays right in now. Next?
- Google Analytics Integration
- Adding the click-out data to one of the internal data sources (SQL work)
- Installing SQL Alchemy and doing the 2 internal datasource tabs

--------------------------------------------------------------------------------

## Wed Apr 20 10:35:56 EDT 2016

### Dear John

Yesterday, I let my WordPress upgrade to 4.5, and my dropdown menus on my MikeLev.in site stopped working. I ran into a problem having to do with a jquery statement being done wrong in many themes, and my theme Keisus is no longer in Theme Forest, so I looked pretty SOL. WordPress basically upgraded its internal jquery, and it broke some obscure old commands that were technically wrong, but still worked. I had to add this to my functions.php:

```php
if (!is_admin()) add_action("wp_enqueue_scripts", "my_jquery_enqueue", 11);
function my_jquery_enqueue() {
    wp_deregister_script('jquery');
    wp_register_script('jquery', "//ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js", false, null);
    wp_enqueue_script('jquery');
}
```

I will probably take this as an early warning of the lifecycle of Keisus being just about up. Let John Morabito know. Done. The official WordPress support page on the topic is here: https://wordpress.org/support/topic/read-this-first-wordpress-45-master-list Push this entry out so I can show it to John!

--------------------------------------------------------------------------------

## Wed Apr 20 10:27:34 EDT 2016

### I Am That Peter Drucker-Described "Knowledge Worker"

I would almost characterize the utility and role this journal is playing in my life as amazing, and a long overdue consolidation of idea-capture activity. I still have a bunch of stuff going into SimpleNote (multiple accounts), but I do have to compartmentalize and hold-in-revision certain subject-matter before publishing, or simply hold it back entirely because of proprietary information reasons. I can't share numbers. However, I can share the higher level abstractions, so long as having those notions was itself not work-for-hire. I'm no lawyer, but in the Peter Drucker sense, I work here because I come to the table with a bunch of capabilities that don't exist here, or they wouldn't have had to hire me in the first place. The application of those capabilities can result in new work, but even that new work blends in with my Github projects, like Pipulate. It's not a clear line of demarcation, and so I simply need to drop the idea-streams into the right receptacles to have the desired idea-processing effects without accidentally giving away the farm. Easy enough. My book should be: "I Am Drucker's Knowledge Worker". Ooh, a headline for this post... I not only manage in times of great change. I thrive!

--------------------------------------------------------------------------------

## Wed Apr 20 07:21:41 EDT 2016

### Tighten Your Belt Every Day

Acceptable living conditions exist on a spectrum. Very little is clear cut, and humans have a tremendous ability to adapt to enormously varied and diverse conditions, for better or for worse. One of our defining characteristics as a species is likely our ability to adapt. We're in good company with rats, cockroaches and tardigrades, and probably a bunch of bacteria. We're the higher-order, more highly organized lumps of self-automated, self-perpetuating clever matter hacks that are still not overly specialized to the unique conditions that gave rise to our kind.
We can step out of those unique conditions (trees, savanna, coastal, whatever) and still be these higher-order abstract thinking machines. We don't suffer collapse when our ecosystem changes, like Mammoths and Saber-tooth Tigers. Rather, we are often the agents of the change. As such powerful creatures, on an individual level, we should not be suffering because of our immediate personal environments. It should be crafted to serve us. So even if you COULD adapt to something that was hostile, there's no reason you should HAVE TO live there on the spectrum. But as creatures of habit, locked in battle between the daily-grind and the desire for more and better, we have to rise above. We have to give extra time, and overcome our equally human tendencies to just give in and go with the flow. Resisting that temptation... putting energy INTO the resisting of that temptation... THAT is the tension in the machinery. You have to tighten the belts every day. -------------------------------------------------------------------------------- ## Wed Apr 20 06:30:14 EDT 2016 ### Keeping Tension in The Machinery I don't have to do everything all at once. I do however have to make a net gain every day... make it visible... if my surrounding environment IS my checklist, then striking things off the checklist is a matter of truly improving things visibly in my immediate surroundings. I have been doing this lately, and this writing is really just a reminder to "keep tension in the machinery" and keep doing it. Gerard was always impressed with my "tension in the machinery" metaphor in business and human/coding interaction systems. It's time I apply that to my own life. But the tension in the machinery is tension-towards-order. Becoming lax and releasing the tension allows the surroundings to gradually drift into chaos, as if a constant gentle earthquake were always happening. Maybe not so much anymore, but it's just a matter of degree. Now that I'm no longer drowning, don't simply wade. Swim hard! Swim fast! And reach the finish line, as you mix in a few more metaphors. -------------------------------------------------------------------------------- ## Tue Apr 19 22:45:23 EDT 2016 ### The El Captain Upgrade Wipes Out Developer Tools Wow, had to make the developer tools install again on my Mac that I just upgraded to El Capitan before I could git pull this journal. I'm really getting into it -- something which was previously evidence that I was still in control of my life a tiny little bit not being something I can put a little quality refinement and organization into... wooh, this is gonna be sweet. -------------------------------------------------------------------------------- ## Tue, Apr 19, 2016 10:16:47 AM ### First OAuth2 Error With Google Python Client Library HA! I got my first one of these: raise HttpAccessTokenRefreshError(error_msg, status=resp.status) I wonder if I'm going to have to go through the same rigmarole to get myself authorized again, or whether the refresh token will kick in. -------------------------------------------------------------------------------- ## Tue Apr 19 09:52:55 EDT 2016 ### SEO Master Series Okay, finally caught up with and me Tyler. Figuring out what we need in terms of Tableau. It seems desktop licensing is necessary for "making new tables", and only costs ~$2000/year for the first year. After that, it's ~$400 per year to renew the license. Convey this up the chain so we're ready for when the trial versions expire. 
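(Back-of-the-envelope, assuming those quoted figures hold: one Tableau desktop seat works out to roughly $2,000 + $400 + $400 = $2,800 over a three-year stretch -- the kind of number worth having ready when conveying this up the chain.)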
That was a great finding this morning, in addition to just finally meeting Tyler. Today, I'm giving an SEO Master Series #1 talk. Haha, gulp! Stage fright? Nah, been waiting to hold court like this all my life... wait, transpose my morning writing script over here. I should do that sort of thing more often.

## Hi, I'm the new SEO Guy (Bert/AD)

- I'm here to scam you.
- No but really, I've been waiting to hold court like this all my life
- I have no PowerPoint
- With my agency hat on, that would make me panic.
- So, show them this journal... Hi, everyone!

### DEVELOPER DEVELOPER

But then I said, hey, these are developers. Let me just ask them what their favorite editor is. Here at ZDNet, you could answer Sublime or Sasha. I'm a vim person, myself.

### AMIGA

Mostly because Bram Moolenaar first wrote it for the Amiga computer. I've been in some area of marketing or tech going that far back.

### FOCUS!

- But we're here to talk about SEO today.
- Consider showing Googling myself and SEO Consultant NYC

### SUMMARIZE

Fewer, longer pages, sorted with great care regarding the number of short click paths to the page -- usually from the homepage, but really from the most hit-by-search pages. From the homepage is just common wisdom. Most websites don't have subpages that outperform or even equal the homepage. At least one of ours does. I know this not because of analytics, but because I'm running SQL against core.

### HIGH LEVEL ABSTRACTIONS

#### DEFINITELY

- Make serendipity happen
- Principally, by removing Mr. Broker Middleman
- Do it in a lightweight way
- Separate concerns
- Think transforms
- Target anything

#### MAYBE

- Expect and accept VM blobs (too abstract?)
- Talk about that population/logistics curve
- Both in the published content herd sense
- And in the Google getting smarter sense

You don't need to do it with the XSLT data transformation language and the xsl:document element, but learn about apply-templates.

### SLIGHTLY LESS ABSTRACT

- Sort your core target concepts
- Align them to the 1111 master portfolio of unchanging URLs (draw the pyramid)
- Light them up with every signal imaginable
- Don't change them... but wait!
- Add the deep archive

### THE MECHANICS

- From here to there?
- DECRUFT
- Sitemaps blurring into existence from real-time search hit data
- And then, there's the illustrious crawl (mention gephi)

#### So, the mechanics are

- The tracking gif data collector
- Urchin and things like it that sell your data back to you
- Google Search Console (previously Webmaster Tools) and things like it that do this in the worst sort of blackmail-esque way
- But the API is not bad, and the data, if it is to be believed, is GREAT

### Oh, talk about impressions

- Talk about directional indicators
- Talk about the giant pie and zero sum games
- Talk about the media game in general

### Wrap it up

Okay, went from personal to 40K-foot view, nearly to implementation detail, all the way back to abstract... but the very familiar abstraction of the media, of which we are all cogs in a horse-race, and whatever other metaphors you want to mix in, because after all, what's a meta for? Better publishing Kung Fu.

### APPENDIX

- Talk about stacked area line-charts

--------------------------------------------------------------------------------

## Tue Apr 19 06:36:29 EDT 2016

### Personal Cat History

As I clean up after the aftermath, I can see that I was paying the cleaning service just so that I could live a defeated life.
There's so much basic better living that was/is only a nice cleaning/organizing session away, if only I had the time/was permitted without animosity to just do it. I can't understand how a human being can not care about this stuff -- it's just so satisfying to make things nice. And I need that level of "okay-ness" around me to keep my peace of mind and be in that ready-position mental and physical state from which I can tackle all the things in life that really deserve to be thought of as challenges. This is just daily life. As I do my first-pass 80/20 rule sweeps, the funny thing is that much of the raw material I have to work with for awesome-life-ness is buried down there somewhere. I just have to dig it out and put it on display, so when you're ready to open of a can of awesome, you can see it sitting there on a shelf or something, marketing itself to you like in a store, so you can just reach out and grab a can of awesome. There's that Michelangelo chipping away to reveal the sculpture hidden within the rock metaphor again. That's one of the strong recurring metaphors -- very related to the connecting-the-dots metaphor, which also has staying power. Gotta organize these. Tue Apr 19 07:48:27 EDT 2016 Ahhh, food for the soul. It's amazing how much you can do in a couple of hours. Thank you, kitties for waking me up at 5:30 AM. Stories! Cat stories. Adi is always interested in my cat stories. Billy and Sammy are the first two cats in my life I am doing successfully. I used to joke that I could measure my life in terms of Costco straw boxes. But really, I can measure my life in terms of cats... and sum it up. Nermal... I closed the door on by accident and killed when it was a kitten and I was a very young child. When I was older, there was Scooter who we had to send back to the SPCA after I fell on him during rough housing with my sister, and he tore my cheek open (18 stitches and a scar). Then there was Merlin who Joyce and Dad got and thrust on me after Joyce moved out and Dad died. I ended up giving Merlin to my cousin Jo Ann, and he lived out a very good and dominant and surely richer life that he would have with me alone, but I did give him away when I moved to DC area for work. Shit, Billy and Sammy are the first two cats, or pets at all for that matter, that I picked out on my own and am going the distance with. They've seen a breakup with a girlfriend, the next relationship become a marriage, kid, and an (imminent) divorce. Ahh, Billy and Sammy. And now I have a rough little case of a third cat, who we realized afterwards was some pretty damaged goods from his youth, but he's come around. Well, this will serve as the seed inspiration for the cat chapter in my book. Don't forget to discuss toxoplasma gondii, and how I'm pretty sure I have it, and that's what turned me from suburban to urban -- asides from Adam Edwards via Dale Larson via Commodore via Marc Rifkin via Ed Flocco via having to re-pick who my circle of high school friends were via Frank S. So, I owe it all to Frank, hahaha! You know, I should really go to visit him and make amends. Start seeing what's possible via Josh H. Okay, time to go to work. -------------------------------------------------------------------------------- ## Mon Apr 18 22:13:04 EDT 2016 ### The Little Things in Life I just watched a one hour show after having talked to a friend for an hour, after coming home from work a little later than usual, after a nice round-about walk in the neighborhood just to see the sights. Wow! No pressure. 
No one else depending on me. Dying to see or hear from Adi. I don't think that's going to happen too much. I should call her. And I will, soon. I think I should come up with a plan to make every one of our old iPhones into a Facetime phone. Interesting! That and every Andriod into a Hangout platform. Adi and I could be calling each other all the time on all sorts of devices. Oooh, ooh, this goes in tandem with Tardigrade Circus, doesn't it? It certainly does. I'm so excited. Only question in my mind right now, is am I going to try to do work NOW? No, of course not. But I am going to look at my code, and then get an early night's sleep. Woot! It's really the little things in life. -------------------------------------------------------------------------------- ## Mon Apr 18 19:50:53 EDT 2016 ### Take THIS, A.I. 'Cause I Know Ain't No Humans Reading This Okay, this is unique. I continue right where I left off when I get home. Total throw-back to my Scala days, although I didn't quite have the savvy in those days to work effetely remotely, like I do today. Back then, to do a journal entry like this, I'd be typing directly into a web browser into a textarea that was created by my own crud generalized system agile framework thinkgamagiggy, although I didn't know back that that that was quite what I was doing. Oh, if only I wrote it for extraction and generic use. I sort of did. I used it at more places than just Scala, I guess it's okay to say. I used a forked version of it at Connors Communications to run their Web enterprise business software, which happened to be custom built on a SQL Server Microsoft ASP custom app written in VBScript. Hey, this was the stuff I knew! My generalized system would work super well here. Indeed, it plugged right in, hooked right to those tables, and replaced the existing UI-layer like overnight. I remember talking to that developer on the phone that first time listening to him explain what he had built. I'm like uh huh, uh huh, okay. Check. I can plop my magical introspecting auto-adapting generic nested external key sub-table auto-joining for orders, bill-of-materials, and all sorts of everything is actually really only a cruddy-nested-list stuff. When you think about it, it's all just linked lists. And LISP is really built on that fact to the nth degree power. But who needs it? We can all be wizards living a lonely life shooting lightning bolts from our LISPy towers at each other, proving who can out-DARPA the challenge, while we Pythonistas are already onto enjoying the after party, because our solution wasn't only there in under the time allowed, but we had already iterated several times, and had to choose between several equally excellent competing github projects destined to help people like you do precisely that highly specialized problem domain thing you're doing right there, for which you thought you must be the first, but are humbled by how well thought out this area already is in the Python world. End scope. Destroy all those variables. Matters not how long a dangled a participle or whatnot. Clean slate! Clean slate! This is Python, and if my performance sucks, it is not my fault the language. No, it is the fault of the CPython reference implementation blessed by our benevolent dictator, who nobly job-hops, because he's so FOSSy-bad ass that only Linus himself could make him look schooled. Guido van Rossum's Kung Fu is better than yours -- unless you're that Harold guy from Person of Interest. 
It's better than Matz's in my opinion, because Matz was trying to perfect Perl -- that kooky, insane, still-in-use language beloved by many a cunning linguist who enjoys having many of the advantages of a true natural language inside their rigid, structured machine language. Borrow from everything, because DOES IT BLEND?! Yes it does. Perl blends everybody's favorite features together while still miraculously managing to work, and it's quite notable for being the one more-powerful thing past the Unix shell that everyone agreed on for a generation or two, so now like every Linux install script is still written in Perl -- but that's becoming less significant with the popularity of the big free software repos like Debian's (Ubuntu), Red Hat's RPM, yum, and many other named systems. The Raspberry Pi's repo is a derivative of Debian's, like so many things. But there are also some really obscure, out-there-on-the-edge repos like Tiny Core Linux. And that's what I love -- for its 8MB core OS size, and its 60MB contains everything including Python, vim (to make it a developer environment too) and a running application in a webserver. In memory, you probably want 512MB of system space to run. But on the drive, my stuff, including Linus' Linux and Guido's Python and Bram's vim, serving apps to the local OS, be it Mac, PC or another Linux desktop... geek, geek, geek. I have probably at least caught up with a lot of my old Commodore friends. And I do mean old in every sense of the word.

--------------------------------------------------------------------------------

## Mon Apr 18 17:18:27 EDT 2016

### Python Date Ranges

Okay, let's do a Hail Mary play at getting this project done. I finally have all the date ranges worked out in Python.

--------------------------------------------------------------------------------

## Mon Apr 18 12:57:21 EDT 2016

### GoDaddy Business Practices

Distractions, distractions. Ugh, okay. Get some more caffeine in you, and then bear down on the webmaster tool work. I am so close. I was pissed off at GoDaddy for charging me monthly for a service I never signed up for -- or more specifically, that was included in another service I cancelled, and they started charging me for it after I cancelled the other one. I'm pretty pissed and pushed out an unhappy-customer video. They're not going to care, but it makes me feel better. They charged me for a half-year of service I didn't want or need, because I was too stupid to notice/act on the emails. And now, with a bit of clear mind and a bit of financial panic, I'm able to go and cancel all these things. Hmmmm... Planet Fitness? The banks just switch over the auto-charges to the new credit cards, so "losing" the card isn't an option. Shit, have to either cancel it myself or ask her to do it... assuming she's not using it anymore, which is pretty fair at this point. Don't get distracted by that now, as the charge just hit, so I have a little time before next month's charge comes around. Just don't forget. I got off to a very good start today with GoDaddy. As domains come up for renewal, renew them by doing domain transfers to another registrar. Start doing your "best registrar" homework. Tempted to just go with Google. Mon Apr 18 13:47:05 EDT 2016 Responded to Marat's dashboard in Tableau. I have to get to the SERP stuff pronto. I've had all morning, but sort of not really. Break down what you have to do right here, right now. Don't let yourself look at or do anything else until it's done. Mon Apr 18 16:08:02 EDT 2016 Wow, I'm pretty proud of this little project.
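Roughly, the date-window piece of it boils down to something like this -- a from-memory sketch rather than the exact code I committed, the function name is just whatever I'd call it, and it assumes Monday-through-Sunday calendar weeks:

```python
from datetime import date, timedelta

def gsc_date_windows(today=None):
    """Date windows anchored to two days ago, the last fully
    collected day of data in Google Search Console."""
    today = today or date.today()
    anchor = today - timedelta(days=2)            # last full day of data

    # Last full calendar week (assuming Monday-through-Sunday weeks).
    week_end = anchor - timedelta(days=(anchor.weekday() + 1) % 7)
    week_start = week_end - timedelta(days=6)

    # Last full calendar month (the anchor's own month if it just ended).
    if (anchor + timedelta(days=1)).month != anchor.month:
        month_end = anchor                        # anchor is a month's last day
    else:
        month_end = anchor.replace(day=1) - timedelta(days=1)
    month_start = month_end.replace(day=1)

    # Last full 90 days, counting the anchor day itself.
    ninety_start = anchor - timedelta(days=89)

    return {
        'week': (week_start, week_end),
        'month': (month_start, month_end),
        'ninety': (ninety_start, anchor),
    }
```

The request body just wants plain YYYY-MM-DD strings for startDate and endDate, so each of those dates gets an .isoformat() on the way out.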
Sure, these things take me awhile, but they're in the category of changing everything permanently and forever moving forward. It feels very much a parallel and complimentary to the work I'm doing at home with a deep apartment cleaning and organizing. Everything follows organization. Life is organization... bringing some little semblance of order to disorder -- inducing recognizable patterns. GoDaddy also really pissed me off as I realized they've been charging me $12/mo since September, that I have either not noticed or just overlooked because of the way my life was and where my head was. But now, everything is different. I wasn't that great before I was married either, but I have to get better. Just because I WAS shitty at certain aspects of life doesn't mean I'm justified in becoming EVEN MORE shitty at that stuff once I was married. It should be the total opposite, in fact. Getting MORE organized and MORE on top of things. At least now, I can. I can feel it beginning, and every night going home from work is a sort of blessing. She is still hanging out at the house during the day. But this time, I am the force of nature, and she is the one just stopping in for a visit. All clutter and mess will be swept up and thrown out, or sorted into Ikea bags to drop off with her on Sundays when I drop off Adi -- unless of course it's Adi's stuff AND it's still good OR its really sentimental. A small sampling of sentimental stuff will get a get-out-of-jail-free card. I will make a thing of it. I will make things significant and meaningful that should be significant and meaningful. Fucking-A, it feels good for the old me to start re-emerging. And it's not just the old me. It's the new me too, like the one who's putting the screws to GoDaddy for putting the screws to me. Fuck them for slamming me on their premium domain parking service just for not continuing their fucking domain club. They're going to refund me my money back to September, but I have to maintain a posture commensurate with someone who has a YouTube channel with 2 million views. Gradually start putting the screws back to people and organizations who try putting the screws to me. Feels good. I'm less of a pussy than I used to be. But really, I've had that capacity in me for a very long time, as certain people from my past have learned, after their shitty-person attempts to steamroll me were met by getting steamrolled right back. All started with my sister. If not for her, I'd probably die before I retire, just like dear old Dad. I still might -- that's what's at stake here. Better living through having a backbone, and longevity through levity while I tighten those screws right back. -------------------------------------------------------------------------------- ## Mon Apr 18 09:30:00 EDT 2016 ### Backing Up Note 5 Pics With Android File Transfer Ahhh, just barely by 9:30 PM. Gotta get better at that! Even this morning when I was awake plenty early, I end up being on a half-hour late time-frame. No excuses, and just don't let that happen. You CAN get to bed early, so now you MUST get to bed early, wake up early and refreshed, and get a few things done on that side before you even come into work, feeling good about yourself for a great set of things moving forward in your life before your day proper even begins. Tomorrow is primary day, and as a Democrat, I can really make a difference. Start making an actual to-do list. 
This is the only sensible place to keep it, but I don't want to litter up the top of this document where everybody reads, soooo... time to start implementing those hashtags that I think about all the time (in this context) and never use properly (in all contexts). Hahaha. But later. Rabbit hole evaluations. Check your calendar. Don't forget to do that! Okay, I have some overlapping meetings on the same topic starting at 11:00 AM. Okay, got out the email to manage that. Will try to keep both meetings, even with Bert out. Hmmm. Between now and the 11:00 AM meeting? Well, let's see. First thing is to start backing up as much as possible off your Samsung Galaxy Note 5, which currently has a cracked screen -- but I do have the AT&T insurance, so I have to call them and initiate the replacement process. But first, I want as much as possible backed up, and it's time to start taking advantage of that terabyte of OneDrive storage that comes with Office 365. The only nuance there is that the computer you're using ALSO has to have the space available, and I'm using these old MacBook Airs I have with their tiny SSD drives. Still, it'll get the job done. Looks like I'll be taking my ThinkPad to the meeting, as I'll leave my phone behind and let the Android-to-Mac file copy (and YouTube upload) keep going while I walk away.

--------------------------------------------------------------------------------

## Mon Apr 18 04:54:52 EDT 2016

### Externalized DateTime Functions for Python Interactive Console Testing

Okay, I chose the route of getting some sleep, but I am at work now before 5:00 AM, so I got that at least not terribly wrong. I was planning 3:30 AM, but hey, an hour and a half of extra sleep can't hurt that much. Let's see what I can do with these few extra hours of focus in the morning. Okay, where was I mentally Friday when I left off, and where do I want to be when I get into work this morning? If I don't have the "Sessions Tab" done, that's fine, because it'll... oh wait, check if you have API access. It should use the same client libraries as Webmaster Tools... clarify. Yup. It's the same: pip install --upgrade google-api-python-client ...as was required for Webmaster Tools, so I'm going to guess there's a whole lot of stuff in there. Good. The OAuth2 logic will probably apply across services. Let's make sure that service is turned on for my Google account in particular... that's through: console.developers.google.com Okay, turned it on. Looks like quotas are against individual accounts, and not the overarching (managing) organization. Here are the quotas for the Analytics Reporting API V4 (the quota group for the Analytics API):

- Daily limit per project for all requests: 0 of 50,000 used
- Requests per project per 100 seconds: 2,000
- Requests per user per 100 seconds: 100
- Discovery requests per 100 seconds: 17,000

Okay, not bad. This writing is definitely key to getting back into the swing of things. This helps make my work-at-hand every bit as interesting as anything else in life I could be doing. That's really key to motivation here. With 7 billion people on this planet, someone (me) is bound to be turned on by this sort of work. Obviously, others have a passion for it too, so I don't have to feel alone. Yay, Internet! When last we left off, I had decided on an outer-loop handling mechanism, and as I recall, it was to make everything part of one single object, and then I zip together horizontally parts of the object that are internally initially stacked vertically in the object, such as it were.
resultlist[timewindow[rowdict{columns}]]

So, I'm going to dump a bunch of results into a list object, and inside that list object is going to be a bunch of time-window objects, which are also lists. So, so far we are dealing simply with nested lists. But inside time-windows, we've got a bunch of dict objects that follow the typical syntax for such dicts striving to be non-positional spreadsheet or RDBMS table-data, with the column names of the rows bound to the column data of the rows by virtue of dict name/value pairs. That way, no matter how column orders change database-side (re-arranging column orders in the spreadsheet) or application-side (some sort of in-memory sorting), when the data gets sent across the channel, it automatically correlates everything to the proper targets. It's less brittle than positional manipulation, but a bit more verbose, and I'm sure there's way more processing overhead. Anyhoo, onto the coding. Ah! Python date stuff. I need to figure out how to express:

- Last Full Calendar Week
- Last Full Calendar Month
- Last Full 90 Days

All of these are from a starting point of 2 days ago, which is the last fully collected day of data in Webmaster Tools. This is a cool project! Long overdue in my career, and opens the door to tons more goodies out of google-api-python-client. Mon Apr 18 07:45:14 EDT 2016 Okay, ended up a bit distracted after all. Had to arrange some furniture... hahaha! More urgent than you might think. Anyhoo, don't run late today after all this. Shower and get ready casually, and get in early... especially for a Monday morning! It's time to be smart... extremely smart. I may not be fast, but everyone who interacts with me should counter with a "Yeah, but..." and go on to say why working with me is still the most amazingly gratifying thing. Github and this journal definitely play into it in a big way. I am thoughtful, and start the iterative process of improvement, transparency and documentation right from the start. Oh yeah, and I externalized the datetime stuff so I can import it to an interactive Python command line for testing.

--------------------------------------------------------------------------------

## Mon Apr 18 00:08:31 EDT 2016

### Act Like A Thinking Human Being

Sometimes a single night like the one I'm planning can change your life forever, at the mere cost of a single fatigued day, but during which you can be proud of your work and feel good about yourself. The alternative is to feel shitty about yourself and fatigued. There are some in-between options, where you can get a little bit of sleep. And I can do one of the types of things I want/need to do tonight, and not the other, leaving the other for tomorrow during the day or at night, from a better position of having gotten one thing completely done, and a few hours of sleep so you can tackle the other tomorrow. The nightmare scenario is to piddle and waste all my time on the wrong things, and get NEITHER of the mission-critical things done tonight. Well, I can feel diminishing returns kicking in after two intense weekend days with Adi, and the need for a few hours of sleep. I think I can get those few hours of sleep and bounce right back awake, so long as I do it totally naturally. I think I can, I think I can, I think I can... again, so serendipitous to be reading the first book of the Thomas Covenant series again. Push, push, push!

--------------------------------------------------------------------------------

## Sun Apr 17 22:53:27 EDT 2016

### Another Fine Weekend

Just got home from dropping Adi off at her Staten Island "fifth house". Another successful weekend. It's definitely the highlight of my week, and this weekend, despite wanting to get out to the Harlem movie theater to see Zootopia today, we ended up staying in the neighborhood with her mostly playing with Hazel and Oliver at the parks all day -- both Indian Road and Emerson. And it's not even 11:00 PM. Woot! Pat myself on the back. And we did get out and really enjoy the beautiful weather on both days -- yesterday getting out to Times Square and Bryant Park, and today staying local. We never did get into the woods to see the aftermath of the forest fire, but that was Adi's call. Lonnie and Oliver did get into the woods and came across what they figured was the remains of such an event, even though they didn't see it yesterday around 6:00 PM. Wish I got a picture of it from the roof with Spider and his family, but my phone battery was dead by that time. No biggie. Gotta decide how best to spend tonight. I'd like to go into the office tomorrow with the big project done, and if I really push myself tonight, I just may be able to do that. But I also have some other paperwork I need to do tonight. Whatever you do, make sure you employ the 80/20 rule and get the most benefit from the first 20% of the effort you put in. Don't choose the most difficult way to do a thing. Don't become your dad. Ugh! I found out wayyyy too late (from Jo Ann and company) that he had a reputation for that in the family. Guess it was reflected in how he decided to have a family, earn money, and eventually die very early. Just because he chose difficult routes doesn't mean I'm destined to as well. In fact, this knowledge should help me by serving as a reminder to always look for the simplest way to get a thing done. However, this very journaling contradicts that approach. This doesn't make the paths I choose necessarily any easier, nor does it guarantee anything better. All it guarantees is that I think out loud about my situations, and perhaps am a wee tad bit more self-aware than some folks. That's all -- if even that. Now, onto tonight.

--------------------------------------------------------------------------------

## Sun Apr 17 18:55:12 EDT 2016

### Monetize YouTube Finally?

It's time to switch over a few YouTube videos to make money finally, I think. I passed 2 million views on my channel, and I'm running short of money. I need to look at other ways to make income, and this may be it. I do the videos anyway, so I can start with just the most-viewed ones and see how it goes. Why make things more complicated? I almost get the chills thinking how effective I will be in life as I get my excess capacity re-directed into personal finances and personal environment. I'll have to keep up the intensity on the work-front and of course the daddy-front as well, but I think I have it in me. I think this is it. I think this is the culmination of everything coming my way.

--------------------------------------------------------------------------------

## Sun Apr 17 17:40:49 EDT 2016

### Python Garbage Collection

Wow, as I gradually dig myself out, it dawns on me little by little quite how buried in helplessness I let myself become.
Not good for my self-image, but thankfully I invested that part of myself that I usually draw from the awesomeness of my surroundings and my things into a much deeper, more internal and personal world impervious to garbage... interesting. I think there is something fundamental about how Java and many modern programming languages have garbage collection, while C doesn't. You have to be incredibly meticulous and self-disciplined to be productive in C, building accomplishment upon accomplishment, with your own personal library. But with Java, you can just focus on structure and patterns, letting the housekeeper take care of housekeeping. Fascinating that I never looked at the parallels in that way. So, what about Python? Easy, scope! There's no heavyweight garbage collection to think about day-to-day -- CPython mostly just counts references (with a little cycle collector for the oddball circular cases) -- and because everything is by reference unless otherwise explicitly stated, and functions obliterate their local scope variables when no longer needed and released, garbage collection in the accommodate-for-C's-shortcomings sense that Java implements it is not something you have to dwell on. If you just stick to good compartmentalized namespaces and simply only hold a lock on memory on objects that are still actually "locked-in-play" through the active scope (and its invoking parents), the whole concept of garbage collection never comes up -- because it never has to!

--------------------------------------------------------------------------------

## Sun Apr 17 10:43:02 EDT 2016

### My Plymouth Whitemarsh High School Experience

As I clean, I will keep sentimentally significant samples of things. I think I remember back to my youth better than most. I keep encountering situations where people in my youth clearly don't remember their treatment of me, like the way Frank S ostracized me from my group of middle-school friends as we went into high school, calling me "************" in a way that managed to stick. I think it was a prick named Scott L who really coined the term, but the whole lunch table I sat at jumped in on it, and didn't let it go -- especially Jordan R and Jason C. The other group of friends I moved to (down the table) were way more honest and long-term, and as I look back, culturally more diverse, and just genuinely higher-quality people, like Guy B, Troy H and Jerry C. Eventually Ed Flocco rose above the others, and I found sort of a life-line in him to the world of Commodore and the Amiga Computer, which really started to heal my self-image, there towards the middle to end of high school. I think emotionally and developmentally, I've been stuck in a sort of time-warp back there. Loss of innocence in the cruel-world sense, but really just still silly, cruel, suburban kid stuff. Emotionally vicious, but still a top-10-percent-privileged-of-the-world sort of problem, so I always brushed it off with that perspective, and I don't think I ever really dealt with the hurt. Now, after my first marriage and an upcoming divorce, I start thinking about these things as I take inventory of who I am and why -- and why my daughter's life is going to be about 1000x better than my experience.

--------------------------------------------------------------------------------

## Sun Apr 17 10:01:37 EDT 2016

### Cabinets

I did not wake up ahead of Adi this morning, although I physically totally could have. The cats demanded feeding at what must have been 5:00 AM, but I went back to sleep and slept along with Adi until about 7:00 AM. I type in here from time to time to capture my thoughts, and when Adi sees, she gets mad at me for "doing work".
I have to purge this nonsense out of her that anything having to do with productively moving one's life forward is work. She has the same reaction to tidying up or anything that isn't just the child version of hedonistic fun. I have so much to teach her! Deferring a few moments here and there for things that may look like work is actually improving the quality of your life in the future by a hundredfold. And even now as I type this, I feel the urge to stop before she "catches" me. I have so much reconditioning of myself to do. I tackled the kitchen cabinet organization and purging for just a half-hour just now as Adi watches Peppa Pig, and I've accomplished more in the kitchen than I have in any given year. Just a half-hour, and it's possible now because the work won't be undone in a week with derision and anger as thanks for having tried to better our lives. We were just fundamentally incompatible on the domestic front.

--------------------------------------------------------------------------------

## Sat Apr 16 20:10:40 EDT 2016

### Not an Indoor Day After All

Just saw a forest fire in Inwood Hill Park up behind the tennis courts from Spyder's terrace. Forgot my keys, so cut over the roof coming back from the Disney Store and Midtown Comics with Adi. Adi got a Spider-Man and an Electro action figure from Midtown. Also did Bryant Park and the carousel. Wow, what a day. Started making a dent in the cleaning too. Totally cleaned the guinea pig cage and turned it into a fort for Adi. Sat Apr 16 20:58:24 EDT 2016 Adi has a personality trait that will serve her well during life, but which for right now can be enormously frustrating. Whenever she doesn't get her way, she either enters negotiations or outright lays out what the retaliatory consequences of her not getting her way will be. Now this would be fine, but I'm still working her through some basics that she's going to have to get from me, like taking nice care of special things and putting them away when she's done using them. Her experience with me here at this place is going to be a bit different now with no cleaning people. The safety net is gone.

--------------------------------------------------------------------------------

## Sat Apr 16 13:57:26 EDT 2016

### Adi Wants an Indoor Day

Adi really wants an indoor day, so I'll be taking the chance to gradually clean and organize without having spent all my energy from a full day at work (plus the commute). This has always been one of my issues. After the work week is over, there's so much to do at home, and I never had the opportunity, or when I did, any move to actually clean up, organize or improve things was met with... well, water under the bridge, over the dam, whatever. That sort of thinking is poisonous. So, just keep moving forward... This is the beginning of operation awesome. I wish this were the sort of thing I could have done while I was part of a couple. This is one of the first times I ever felt relaxed and really at home in my own space, knowing that forward progress in my life won't automatically get rolled back by N+1 degrees of any effort I put into it. Put all your better thinking of late into all aspects of your life. You know, it's interesting, I attribute a lot of my new clarity of thought to Python. I think I'm gradually starting to think in Python.

- Your environment is your checklist.
- Do a couple of 80/20-rule passes. Knock off the most offending... offenders.

Interesting, all I needed all these years was a day of puttering around.
With Adi asking for an indoor day today, it's the perfect opportunity. So, it's beautiful outside. This is basically 10 years of backlog... even though I did move a bunch of times during that time, it's still a decade of disorganization. I am so glad I replatformed myself onto Linux and Python before Adi came. Were I not in this place mentally now, I would be feeling remarkably... what is that feeling? Hmmm. It's un-secret-weapon-itized. Having that through that time period may be the only way I fed the need in my soul for better living through organization. Now, I can have my environment reflect it as well, and it should be a thing for Adi to behold. I recall I tried that in the first place I lived with her in that tiny studio in the Upper West Side that had like a 15 foot ceiling, where I put a shelving system from The Container Store all the way up, and... well, I just stop there. Yup, that me is back, but this time for Adi and myself. It should be a thing for her to behold. -------------------------------------------------------------------------------- ## Sat Apr 16 09:46:01 EDT 2016 ### Why You Can't See My Google Search Console Code (Open vs. Proprietary... again) I have implemented a personal solution to maintaining a highly integrated life (work, home, daddy-hood, etc.) where I maintain both privacy and openness, favoring openness, but with the complete ability to hold-back that which must be held back for the sake of competitive advantage and proprietary concerns. I have had to do this because of how both private behavior and public behavior are both incredibly valuable assets in the type of life I'm trying to lead. I can't just be all open, because I draw a paycheck can't give too much away out of altruism. I'm always asking myself: who is the rightful proprietor of the knowledge, information, know-how, and most especially, data I deal with every day. Then I think about how much more data like the kind I'm dealing with is out there. It staggers the mind to contemplate how much data companies like Google and Apple actually own and have privileged access to. Oh, and Microsoft and the telecoms and a few others too. Any one of them could go into a very lucrative blackmail business overnight if they wanted to. The only thing stopping them is the nature of publicly traded companies and a bunch of nash equilibriums that amount to mutually assured destruction. The real danger is probably rogue employees, hackers, and just dumbass screw-ups that allow data leaks. Each of us needs to deal with our own little silos of proprietary knowledge and know-how and data in our own ways and how we see fit, keeping the same faux pas in mind that can bite the big boys. The existence of this admittedly over-sharing online journal expresses more-or-less where my beliefs on the subject land -- at least, for myself. And rest assured, I'm holding back. I pay the $7/mo to have my own private github repository, even though with a little more thought, time and overhead, I'm sure I could bounce over to bitbucket for free private repositories, and back to github for the open stuff. The fact that I don't expresses an important characteristic in my mind of integrated daily work-flow behavior. The repo for this journal (for example) isn't exposed to the public, so if I make a mistake in something I write, I at least have some semblance of a take-back, undo or mulligan. I could jump between github and bitbucket constantly, but I don't. 
The more complex edge cases, special logic, exceptions and exhausting manual overhead you can purge out of the system, even at a little monetary cost, the better. This is why people will pay a premium for Apple products. As a long-time user and occasional advocate of both sides of the Apple vs. "everyone else" debate, I see and agree with both sides. It's not an easy answer, and those who are serious about what they are doing must pay something for the best (or simply must-have) tools out there. This is why I'm on Office 365, even though my employers tend to provide me some half-decade-old versions; when I'm using PowerPoint, I need objects to smart-align when I drag them around, which only the new PowerPoint does. I haven't taken the plunge into either the $10/mo or $50/mo Adobe subscriptions, because I don't use those tools every day, and they take such a large commitment when you do. For them, there are GIMP and Inkscape, neither of which is really that tough to learn, and they're at your disposal wherever and whenever you need them. So, it's not a clear-cut split between proprietary and open. In fact, even within proprietary and open, it's not a clear split, because everyone has to make money at some point to eat, and a lot of the open stuff is offered in value-added proprietary versions... as well it should be. This stuff is on my mind recently, because I'm documenting a lot of stuff here in this journal where you can't actually go and see the code, and I thought I'd say why.

--------------------------------------------------------------------------------

## Sat Apr 16 09:07:55 EDT 2016

### How You Name Things is Pretty Important

Hmmm. Next step? I've got the structures and mechanisms. What I don't have is the flow and the pattern. Once again, it's essentially a vlookup join that I'm doing between a bunch of tables that are very reasonably expressed as a list of dicts, which is what the Google Search Console API outputs as a result of the request. Maybe I'll make them all sub-elements of a larger Python object in turn, so that I can do this with n tables in the future. Sure. So, that will be a list of a list of dicts. More or less: timeranges[timerange[table{columns}]] ...but I need a stronger identity for timeranges and timerange or I'll get confused in the future. The whole shebang should be... hmmm... They're time samples consisting of different time windows. Time ranges? Time periods? Oh, and implying the datatype... resultlist[timewindow[rowdict{columns}]] That's more like it. Pythonistas love their underscores, but I don't. Too disjointed and interrupted to look at. It works against the strengths of human perception and pattern recognition. It takes a single visual unit that should be brought into the brain AS a single unit, and breaks it into two, relying on the brain to re-join them into one variable name or link or object representation or whatever. Exhausting to look at, and so actually un-Pythonic in my mind. Surprised it became the PEP-recommended convention. And so, I ramble. Next step? Well, to populate this entire object with 4 time-window samples of course, dipshit. Oh! Those time-window samples. Yeah, where I really left off mentally last night... Python's way of defining last week, last month and last 90 days, of course starting the windows back from TWO DAYS AGO, given the last fully collected day of data available.
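To make that target shape concrete for future me, here's a toy version with made-up numbers -- not real GSC output -- showing both the nesting and the keyword-wise zip-together I'm after. It's dict comprehensions inside a list comprehension, which is exactly the muscle the next couple of entries below were practicing:

```python
# Toy stand-ins for what comes back per time window: a list of row dicts.
# (In real GSC rows, 'keys' is a one-item list of dimension values, so
# I'd take row['keys'][0] first; plain strings keep the toy simple.)
week = [
    {'keys': 'blue widgets', 'clicks': 12, 'impressions': 300, 'ctr': 0.04, 'position': 7.2},
    {'keys': 'red widgets',  'clicks': 3,  'impressions': 90,  'ctr': 0.03, 'position': 11.5},
]
month = [
    {'keys': 'blue widgets', 'clicks': 55, 'impressions': 1400, 'ctr': 0.039, 'position': 6.8},
]

# resultlist[timewindow[rowdict{columns}]]
resultlist = [week, month]    # four windows in the real thing

# Index each window by keyword, then walk the first window as the
# master column -- the vlookup-style zip-together.
indexed = [{row['keys']: row for row in window} for window in resultlist]
joined = {keyword: [window.get(keyword) for window in indexed]
          for keyword in indexed[0]}
# joined['blue widgets'] -> [week row, month row]
# joined['red widgets']  -> [week row, None]  (missing windows come back as None)
```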
--------------------------------------------------------------------------------

## Sat Apr 16 08:51:38 EDT 2016

### Comprehending Dict Comprehensions

Sure, there's Python documentation aplenty. But this looks like a job for the interactive interpreter. Hmmmm. Okay, let's whip out a quick list of dicts. And then, let's ensure we can use purely list comprehensions to first iterate through the items in the list, and then each sub-dictionary item...

```
>>> lod = [{'foo' : 'bar', 'spam' : 'eggs'}, {'big' : 'idea', 'small' : 'minds'}]
>>> lod
[{'foo': 'bar', 'spam': 'eggs'}, {'small': 'minds', 'big': 'idea'}]
>>> for item in lod:
...     item
...
{'foo': 'bar', 'spam': 'eggs'}
{'small': 'minds', 'big': 'idea'}
>>> [item for item in lod]
[{'foo': 'bar', 'spam': 'eggs'}, {'small': 'minds', 'big': 'idea'}]
>>> [{key: item[key] for key in item} for item in lod]
[{'foo': 'bar', 'spam': 'eggs'}, {'small': 'minds', 'big': 'idea'}]
>>>
```

Ahhh, Python. So the output of these two things may look identical, but the fact that they produce identical output is actually what I was going for. There's so much expressed here, it isn't even funny. Why are the outputs the same? In the first case, I'm outputting just the members of a list:

```
>>> [item for item in lod]
[{'foo': 'bar', 'spam': 'eggs'}, {'small': 'minds', 'big': 'idea'}]
```

And in the second case, I'm constructing a name/value pair dictionary for all the contents of a dictionary object:

```
{key: item[key] for key in item}
```

...which just happens to re-construct the identical key/value data pair structure that's being fed into it by its per-iteration parent element, which is itself that same dictionary object. I'm saying for every key in this dictionary, output its value. I'm making a one-to-one translator from the dictionary that translates to itself. Stupid, but it asserts the precise control that I know I need, and practices the less discussed, but still very much available, dict comprehensions (as opposed to list comprehensions).

--------------------------------------------------------------------------------

## Sat Apr 16 07:16:13 EDT 2016

### This Looks Like A Job For List Comprehensions

Okay, I've spent enough time contemplating my navel. It's time to start doing a bit of Python coding. The concept I'm doing... well, it's going to be at least 4 very big list-pulls against the Google Search Console (Webmaster Tools) API, and I'm going to have to stay aware of memory and such. Okay, what's coming back from the GSC API is actually a list of dicts. That's good, because for each dict in the list, I can join on a key from another list (and another and another). Soooo... efficiency? Speed? Being polite with memory? Quick research... the modern way to join many large lists of dicts? For each item in a list (obvious enough), spin through every item in another list looking for key matches. Ugh! There are keys (keyword), impressions, clicks and ctr. All of that data is good and I want to preserve it. So, that's a whole lot of columns. Keyword only occurs once, but there's going to ultimately be 4 columns each, for:

- impressions
- positions
- clicks
- ctr

I also like that order, because it implies the relationship between impressions, positions and clicks. We can calculate our own AOL-leak-like click-through ratios. So, the obvious (a Python equivalent of an Excel vlookup):

```
for dict1 in list1:
    if 'keys' in dict1:
        if dict1['keys']:
            for dict2 in list2:
                if 'keys' in dict2:
                    if dict1['keys'] == dict2['keys']:
                        pass  # What is the result data structure that I want?
```
Basically, we use the first list of dicts as the master column against which everything else is joined. It's a lot like an Excel vlookup. I'm sure there's a much better way I could do this with list comprehension. Why wouldn't I? My own comprehension of list comprehension. However, these are the very precise situations it was made for. It would be insanity to deal with all this nesting, when it could simply be expressed as a number of "columns"... almost in a SQL-like way... almost with precisely the same solution as Aleksey provided when I showed him a repetitious SQL join on the version of this I did for search hits against URLs using our own database and data. Okay, take a deep breath. Commit this journal entry. Go read a bit about list comprehensions again. Should be very straightforward, actually. It should be EASIER with list comprehensions than all this nested if logic.

--------------------------------------------------------------------------------

## Sat Apr 16 07:02:07 EDT 2016

### Expressively Encoding In Real-Time

Well, well, well. It's 7:00 AM and I spent probably a half hour or more tweaking this journal, but you know what? That's fine. The way it's continually refining and shaping, it's just eventually going to be my memoirs or whatever. I'll sink my old journal that I took back offline, but which is still in the repository, back into this main head of the repo (bring it back) again eventually, along with my Scala Webmaster Journal, which is probably in pretty good shape to just be pasted into here (all text) and give a bunch of context from 15 years ago to all the new stuff. And then, there's my 30 years of paper journals. Yeesh! Oh well, who the hell knows why I do this shit. Oh, this very in-the-moment thinking is why I do this shit. It's about navigating difficult courses by forcing myself to think out loud, and to essentially practice coding by forcing myself to be constantly encoding ideas. It's amazing how few draw the parallels between actual spoken language and programming languages. Larry Wall, I think I need to read more of your writing. Your Big Ideas video was spot-on, and all that philosophy about linguistic freedom... nice! However, most people's minds are just not powerful enough for Perl, and for us mortals, there is Python.

--------------------------------------------------------------------------------

## Sat Apr 16 05:19:46 EDT 2016

### sample_tools.py

Well, well. Do I have amazing powers finally again, or what. Today is all Adi's, but it's 5:30 AM. The morning is mine! Coffee in-hand, and the challenge from yesterday still fresh on my mind. Private github repo created, and great progress being made. Latest challenge actually interesting and clear. Not only does this work hearken back to my Scala days, but I'm remembering the first time I dealt with datetime issues on the Coleco Adam, back when I was 12 or 13 years old, and reading The Chronicles of Thomas Covenant for the first time! Wow, are synapses being connected in my head! Okay, next! Okay, purge some mystery out regarding this mysterious little line:

```
service, flags = sample_tools.init(
    argv, 'webmasters', 'v3', __doc__, __file__, parents=[argparser],
    scope='https://www.googleapis.com/auth/webmasters.readonly')
```

Sheesh! Okay, so first, we're setting a tuple value. Very Pythonic way of returning the output of a method. The method call returns a tuple, and it's being unpacked immediately into two separate values.
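My guess at the pattern hiding inside it, before I go look -- a sketch of the usual oauth2client dance from memory, not the actual googleapiclient source, and the function name and secrets filename are just placeholders:

```python
# Rough guess at the shape of sample_tools.init() (not the real source).
import argparse
import httplib2
from googleapiclient import discovery
from oauth2client import client, file, tools

def init_guess(argv, name, version, scope, secrets='client_secrets.json'):
    # Let the oauth2client argparser contribute its standard flags.
    parser = argparse.ArgumentParser(parents=[tools.argparser])
    flags = parser.parse_args(argv[1:])

    # OAuth2 flow against the downloaded client secrets, with the
    # resulting credentials cached in a .dat file (e.g. webmasters.dat).
    flow = client.flow_from_clientsecrets(secrets, scope=scope)
    storage = file.Storage(name + '.dat')
    credentials = storage.get()
    if credentials is None or credentials.invalid:
        credentials = tools.run_flow(flow, storage, flags)

    # Build the authorized service object the samples hand back.
    http = credentials.authorize(httplib2.Http())
    service = discovery.build(name, version, http=http)
    return service, flags
```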
But I need to actually look at what sample_tools.init() does, and to do that I need to actually find where it is. So, it's part of googleapiclient, which we can tell from this line:

```
from googleapiclient import sample_tools
```

Okay, where are my externally installed Python packages? Usually /usr/local/lib/python2.7/site-packages/ but let's check... import site; site.getsitepackages() ...and there it is, under /usr/local/lib/python2.7/dist-packages/googleapiclient. Okay, go find the file and open it in vim. Oh! Hello, Joe Gregorio. You write some awesome code that one must do some detective work to decipher, but isn't that always the case? It's there to externalize a lot of repeated code that occurs throughout the samples. Thing is, I'm not using a lot of the samples... I'm using just one, and I need to begin to understand it thoroughly, and get rid of being forced to use sys.argv for parameters, executing the command with: python keywords.py x y z ...only just to simulate parameters that appear to have to be there for argparser to do its thing, only so that I can swap out the values explicitly later on, setting sys.argv directly. Very bad practice, but it's working. But I can't let it stay that way, so this is the first measure to wrap it in and wrangle it under control. ...wow, I successfully "internalized" code, in the sense that you normally externalize this sort of stuff. Peeling away external library mysteries. There's a lot of OAuth stuff in there that will be useful for Pipulate. Let's see how Google does stuff. Anyhoo, this was a massive success. It didn't move the actual results of the function forward that much, but it has shifted massive amounts of control regarding this project into my hands. Pick this apart and deconstruct it to see what's going on. Purge the use of sys.argv. Methinks I detect the presence of named tuples here, per Fluent Python.

--------------------------------------------------------------------------------

## Fri Apr 15 14:43:58 EDT 2016

### Simulating sys.argv Sucks

Yeesh, I'm really going to have to work out this financial stuff to be able to concentrate. Once Pipulate is stabilized, I should really go the promotion route, so that plenty of people see it and can use it, then maybe do Patreon or a hosted/paid version of Pipulate. But, let's continue through this automation task. I should look at the object type coming back that gets formatted by the print_table function. A rows object is pulled out of the response object: rows = response['rows'] Each iterable item in the rows object contains: row['keys'], row['clicks'], row['impressions'], row['ctr'], row['position'] Okay, so obviously, it's a list of dicts. Woot! We're zeroing in. I have to walk out of here at 5:30, and it's already almost 3, so... so, work faster than is your norm and pray that there are no email distractions. Okay, I hacked the sample script to simulate getting a sys.argv, even though I'm not passing in parameters. Sheesh! I have to use four different sets of parameters, one for each time-range, so I have to break the model set forth by the sample. There's going to be a much better way to do this once I understand what sample_tools.init() does, and why it has to be passed the whole friggin sys.argv object. Could this sample be any less flexible context-wise? Shit, okay, running out of time. 80/20 rule next step? Fri Apr 15 16:04:54 EDT 2016 I pushed it up to Github in a private repository. From this point, I should meticulously track my changes.
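For the record, the core request I'm bending the sample toward looks roughly like this -- a sketch, with a function name of my own, a placeholder site URL, and a row limit that's just my guess at a sane default:

```python
# Sketch of the core Search Console keyword pull (placeholder site URL).
def pull_keywords(service, site_url, start_date, end_date, row_limit=5000):
    request = {
        'startDate': start_date,      # 'YYYY-MM-DD' strings
        'endDate': end_date,
        'dimensions': ['query'],
        'rowLimit': row_limit,
    }
    response = service.searchanalytics().query(
        siteUrl=site_url, body=request).execute()
    # Each row is a dict with 'keys' (a one-item list holding the query
    # here), plus clicks, impressions, ctr and position -- the
    # list-of-dicts I'll be joining across the four time windows.
    return response.get('rows', [])

# e.g. rows = pull_keywords(service, 'http://www.example.com/', '2016-04-04', '2016-04-13')
```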
Fri Apr 15 20:34:36 EDT 2016 It appears that GSC's last full available day of data is always 2 days prior -- NOT yesterday. This has an impact on last week and last month as well.

--------------------------------------------------------------------------------

## Fri Apr 15 09:24:30 EDT 2016

### Google Search Console API Automation (Previously, Webmaster Tools API)

Okay, just paid my maintenance for the co-op, and I'm going to have to deposit that check first thing on Monday, when I get into the office to pick it up. I need to get my act together! Sheesh! This money shortage thing has got to stop. I need some sort of safety margin... or do I? Really, that just removes the panicky onus to get my house in order... the real thing at issue here. Spending is out of control, and I have completely over-extended myself on recurring expenses. I cancelled the cleaning service, which is a start. But I need to do more. I need to start pinching pennies and get down to the essentials, and give up some luxuries. But first, I have to focus on this SEO Report. Test your VPN connection, to be able to get into the office network from home today. Even if you can't, you're fine given the nature of the work you have today. Go open the Webmaster Tools API page... or rather, the Google Search Console one. Fri Apr 15 09:46:25 EDT 2016 When I do things right, I cross over into another place that is hard to access, hard to get to, and hard to maintain once there. If it was easy, everyone would be doing it... but alas, that doesn't seem to set me above and apart, as I'm busting my ass as much as the next guy to keep my head above the surface, and that's not right. It's just about keeping a little bit more of the reward. Live on this razor's edge a bit longer, but use your newfound time to course-correct. API page... Okay, it makes no sense to do this work anywhere other than on the Pipulate DEV server. For the sake of time, I'll use the existing dev server, but I do need to get it off Rackspace and onto Wable soon. I just switched to my buckling-spring Unicomp keyboard as a favor to my fingers, and a pleasure I haven't been able to indulge for a long time. Okay... Log into the Pipulate development server... okay. Murphy's Law evaded. Next? Just check running Python in console mode... okay. Now, go find the Python libraries for Webmaster Tools.

- https://developers.google.com/webmaster-tools/v3/quickstart/quickstart-python

Okay, first I need to create a project, enable the API, and set context. I'm up to the context question, and the choices are:

- Web Browser (JavaScript)
- Web Server (e.g. node.js, Tomcat)
- Android
- iOS
- Chrome Application
- PlayStation
- Other UI (Windows, CLI tool)
- Other non-UI (e.g. cron job, daemon)

Well, that's MUCH better thought out than the last time I was here. Interesting observation: OAuth2 is part of the war(t) to keep the web browser relevant as it comes under attack from native apps and such. By actually requiring a browser in the picture to authenticate (easily), you ensure that... well, that a web browser always stays in the picture, and that Google can continue offering services to the masses who increasingly accept the defaults of their phone or new platform, because it's so fatiguing to keep customizing every new platform with a smattering of apps from ages past you stopped using long ago (the Google mobile app, Google Plus, etc). Google's last line of defense is to hope that a standard browser is still in the picture... okay... back to the task. Which to choose.
While it WILL be running like a webserver, I would like authentication to occur like a non-UI CLI app for when it gets scheduled and such. Okay, I have to make a service account name and select a key type. I'll choose JSON. We download this to recover our credentials if ever lost. Got it. Now, we create our credentials. We select from: - API Key - OAuth Client ID - Service Account Key Oh, it's going to be a service account key, for sure. The less OAuth-like we make this thing, the better. This project is going to be... uhhh... oh, is that true? Shouldn't I use the Pipulate user-provided OAuth credentials, and NOT run this thing like an automated server thing in the background? Hmmmm. Not on version 1, for sure. Make a test script that works in server mode... hmmm. Oh yes, that menu was to create NEW credentials. I can create new credentials to switch this over to self-help with user-data with their permission under Pipulate, after I get the first round (what I need for today) actually done. So, let's just go with the credentials I already downloaded, which I believe is that .json file... that CONTAINS the credentials. That's going to be like a username/password file, or like an OAuth token. Okay, next step? Client library, of course! pip install --upgrade google-api-python-client We copy and paste some code into a file named webmaster-quickstart.py Ugh! There isn't a client secret field in the json file. This sort of authentication context is different. There's a big fat client key in that json file, which makes sense, but that's an experimentation hurdle in altering the sample code that I don't want to put ahead of having a good looking report ASAP. Soooo... so... Manually pull the Keyword data right now PRONTO! Keeping the columns aligned across 4 time-ranges will be a massive vlookup project with lots of supporting tables. Ugh! Don't pull it manually. Get the login working. The OAuth libraries will surely support it. I'm working off all the latest now, and there should be examples of how to use this json file as an alternative to the client ID and secret. Of course! Yeah, the quick-start guide at this location: https://developers.google.com/webmaster-tools/v3/quickstart/quickstart-python ...tells you to actually surf to that file. The assumption is a web-based app, which is not my case. So, I simply have to find the example of authenticating using using that json file. Go look... https://developers.google.com/api-client-library/python/auth/service-accounts So, get that file onto the server. Copy-and-paste through terminal is still easier than scp. Okay, done. Now... let's see... Baby-step up the login example. Not sure if I'll use the webmaster-quickstart.py example they provide. Too many moving parts at once. Build it up. Hmmmm. Is this ready for Github? Why not? Baby-step this thing and make some web documentation out of it. Let's give it a name, first. Oh yeah, github! Well, let's find a similar project... https://github.com/chipoglesby/searchconsole/blob/master/search.py ...though aside from the old-style client_secrets json file, I don't see how authentication is occurring there. No worries, just use the example that uses the file to create a credentials object... Doh. This is the official Google sample to be starting with: https://github.com/google/google-api-python-client/blob/master/samples/searchconsole/search_analytics_api_sample.py Fri Apr 15 14:02:26 EDT 2016 Okay, pshwew! 
I finally got the data pull to work, but I had to use OAuth2 with the client_secret.json file, which means I have to jump through hoops to actually authenticate. It's the enemy of automation, but at least I will be able to adapt it to do the 4 necessary date-ranges and zip them together Python-side to avoid vlookups GSheet-side. This is still the best approach. Now, it's time to alter the sample script until I have what I need. They've arrived. Adi has "after-school" classes today, so I'm sure she will be dropped off there around 3:30. She will surely ask me if I can pick her up at 5:30. It's a difference between 5:30 and 6:00 PM, so it would be quite rude of me to say no, but it's already 2:30 and I'm not as far along as I need to be. If I get Adi to sleep tonight, I can pump myself up on caffeine and finish this out. I wish I was faster at this stuff, but... oh well, it is what it is. Some might say I do too much writing, but this is half the reason for doing any of this stuff at all in my mind at this point. We are leaving impressions. Let's see, I got over the big hurdle. Figure out 80/20 from here. Tear out everything but the top keywords request, re-execute the query, and hope it doesn't challenge for authentication again -- and if it doesn't, look at why. Okay, done, and it doesn't, and a webmasters.dat file is written out, which even has a refresh token! Okay, this approach may be more workable than I initially thought for automation. I think it's time to commit and push before I finish out this project. Publishing part-way to the finish line...

--------------------------------------------------------------------------------

## Fri Apr 15 07:58:33 EDT 2016

### Could'a Should'a... Live, Learn, Improve

It is high time to focus on Webmaster Tools, now called Google Search Console. Sigh... another blow to old-school SEO. Where's WebmasterWorld and the Pubcon crowd these days? Gradually fading into obsolescence? Forward-movement, and re-invention, and the gradual re-writing of history -- or at least, expectations -- for a new generation of marketing pups. Ah, so good to have Robert Ringer's view of the world fresh in my mind, and really never having forgotten one whit of my learnings from Scala Multimedia Digital Signage blah blah could'a should'a owned the world just like the Amiga it was derived from. But the world is full of woulda's and shoulda's that died, lessons lost. The Library of Alexandria is how Carl Sagan laments this issue in the opening of his Cosmos series, the original and the real-deal, genuine-article version by that name. So rarely do things cross over that divide between "should-make-a-difference" and "did-make-a-difference". There are so many people in this world involved in so many endeavors that even if 1 or 2 percent more came to fruition, the world would probably be a much less stable and more turbulent place as new ways duked it out for supremacy. Python and JavaScript both passed over that massive divide recently for profoundly different (and complementary?) reasons, so I embrace both. Python is like the Motorola 68000 at the heart of the Amiga, and JavaScript is like the Blitter... oh, and vim is like the Copper. Hahaha! What, maybe 1 or 2 people left on this planet would even understand that. Mike Sinz, I'm talking to you. Is RJ still alive? Oh, and Dale. And a handful of those who inherited the thing at Commodore, and of course a handful of insanely once-in-love fans. And here I go down the writing rabbit hole. Well, it's still only 8.
Get that check written for co-op management, and go drop it off and tell them about the leak in the bathroom. See if you can't get some of that super love you almost never ask for because it's always been a problem for them to come over and fix things, because there has always been someone here to meet them. How different life can turn out than the way you think it could'a, should'a. -------------------------------------------------------------------------------- ## Fri Apr 15 06:10:35 EDT 2016 ### Healing Well, I could not help but write last night. Much on my mind, and that is how I choose to use my discretionary time -- or, at least did last night. The, I could not help but sleep. Wednesday night I spent with Adi at her request, though I did not exactly know the whole context of the situation, as I almost never do when it comes to her. Today will be interesting. I will be working from home today, and I cancelled the cleaning service. That will save me almost... well, I'm too embarrassed to say. Suffice to say, it's pretty big. I have to make brilliant use of time today, and if they do pop in, I have to emphasize the importance of leaving me be. I will close a door... imagine that, closing a door without guilt or apology. Shit, today better work out. The only thing that may distract me a little bit today is organizing, which will actually help me be effective. I did get sleep because I needed it, and that will help me today too. It's still only 6:00 AM, and what I can do is slam out the SERPs tab, with total focus and a fresh mind. Fri Apr 15 07:51:36 EDT 2016 Today can either be a miraculous move forward in my life (my first somewhat uninterrupted focus day at home in maybe 10 years -- though still not a full day and still not fully uninterrupted). But still, a big potential step forward if I plan it right. Don't squander time. Already, I took a hot bath, had a relaxed coffee and read a chapter of Thomas Covenant. At least I did them all at the same time. And now I come out refreshed, and my mind and body invigorated and healed. Ah, it was meant to be that I re-read at least the first book of this series now in my life. My last re-read was the first of Isaac Asimov's Foundation series, so from a tech and miniaturization and forward-thinking long-game perspective, I'm replenished and refreshed. And now Lord Foul's Bane is sort of another version of that, but the inner world and healing and similar high-strategy and long-game. I get a lot of long-game stuff into my head. I think I need to beef up the effectiveness of my short-term plays. I think I'm in a much better position to do that now than ever before in my life. Python, Linux and the Web Browser have filled the emptiness left by the demise of the Amiga computer in my life. Perhaps vim a bit too. -------------------------------------------------------------------------------- ## Thu Apr 14 21:33:15 EDT 2016 ### Strange Attractors I deserve to treat myself well. I deserve to be doing many of the things I imagine myself doing when I think about treating myself well. Voyages of the mind are much more important to me than voyages of the body. I don't want to put down traveling or travelers, but I find no deep meaning in being a place, even if it's a mythical history I've been learning about and even admired all of my life. Grand Canyon? Meh. Of course, the conditions under which I encountered all that probably desensitized me. And when I started reading The Chronicles of Thomas Covenant, The Unbeliever, something spoke to me. It held me. 
I could have put down the book with all that arcane, impenetrable Tolkenesque language, but I didn't. I stuck with it, and got through I believe 6 books. I know he's done more since I read them starting at 12 and through my teens. It was my first Epic reading experience, and Covenant's deep inner world and competition for what was to hold that place as his inner world between leprosy outcast unclean, happily married successful writer, and reluctant hero of a strange diseased land. How could I not see the leprosy parallels immediately? Oh, I was 12? And now I'm re-reading it at 45 years old. Wow, talk about connecting points in your life. Israel and Bar Mitzvah and Camp Watonka near Lake Wallenpaupack and the kid I went to camp with who always used this world as a backdrop to his Dungeons & Dragons adventures. Sheesh! That was another kid who got through that book at that age. I wonder what ever became of him. And all those other kids. That was the summer of 1982, right before my Bar Mitzvah, where I first learned to code a tiny little bit on the Radio Shack TRS80. I never got the Apples, and half coveted them even then, because of graphics -- or maybe because everybody else did and there were fewer of them. I can't remember which. Ah, and so I transform myself in my mind, just like I did in those early days of my youth, from a 12 year-old turning teen. Oh, what a sad, awkward teen I was, and what an only slightly less sad, and still quite awkward adult I have become. I find great joy in my head. I marvel that an emergent side-effect of evolutionary pressures has led to me typing markdown into vim on a highly mainstream Unix machine, committing to a git repository, and pushing up to Github where it automatically publishes through the magical mechanism of github.io, which I really should learn a bit more about how it works, given that I'm resolving the apex domain mikelevinse.com to, so I can talk about all things SEO, and many things which are most decidedly not. Yes, I have a good time in my mind. Maybe I should be a writer or something. Book fodder. I have too much fun writing to worry about writing books and traveling and shit. Shit. Coding is writing. Okay, feel good. You're spiraling around something now, caught in a distant gravity field. You just don't know it yet... or rather, you are only barely at the edge of suspecting. -------------------------------------------------------------------------------- ## Thu Apr 14 20:50:37 EDT 2016 ### Gravity Wow, there were 275 lines of insertions into this file since the last time that I sat down to edit it from this particular machine. Very interesting. I'm going to have to teach Adi how to type, and how to use old-school computers before they went all touchscreen and voice recognition. It may be that we never have a need to touch hardware at a low-level with coding again. It may just be things go all meta, and pretty soon only machines can program machines at the low levels. We're already pretty much there with compilers. Nobody programs for Assembler anymore -- or at least, very few people do. Maybe the embedded system guys, squeezing every ounce of value of burning down that sand into silicon. Ah, semiconductors. Ah, the trick of switches, and a digital bias in all things since we figured out the integrated circuit. How much rich technology and capability and unique ways of interacting with the world are we denying ourselves due to our overwhelming digital bias? 
And I don't mean electronics bias, because much of electronics can exist without digital. It's more organic and biological in nature, relying on many of nature's miracle properties exhibited by materials under specific conditions, and not just the one very simple property of semiconductivity that makes switches of solid state materials for rivers of electrons to flip and flop this way and that, as they barrel through friendly paths laid out for them, herded and corralled through clever little doorways that make calculations of their directional rush. There, I have sufficiently turned away all but the most dedicated followers with a dense and inaccessible wall of babble. If you're still reading down to this point, and you are with me nodding your head "yeah, yeah", then I applaud you, my loyal fan! You have an attention span like few others.

I watch who reads this journal and how, with a 3rd-party tracking system called HotJar. I've been aware of this technique since ClickTale, a sort-of contemporary of HitTail that is also still around, but it will be interesting to see how they hold up under the competitive pressure now from HotJar, which gives away up to the first 300 recordings for free, so you can really get a feel for the value of this thing.

Every day at ZD, I am newly impressed with how good a match I am for that company, and it for me. Wow! Have I been in training for this ever since Scala... and maybe even my Amiga days, or what? Yes, I am working with minor celebrities in some cases -- shall we call them long-tail celebrities. And we really do have some amazing talent and assets as a company. And I'm going to unleash a can of "been waiting a long time for this" all over this company. Serendipity pops to mind. I wouldn't have been able to make the most of this opportunity were it only a week or two ago. My life has changed significantly. And it's not that I'm overflowing with time now. Rather, it's that I'm only just typically time-strapped. And now that I'm a weekend Dad, there are no weekends in terms of me-time. Taxes and all that... shit. So bootstrap yourself, young man! You have a lot of time left to have your second or third awesome phase of your life.

Frig! I've had a kid. I'm a dad, having brought precisely one new life successfully into this world, as has my dad. Check! Now, move on to being an awesome dad in ways my dad never was, for the precise same reasons I'm stressed out: work ethic! Dad over-extended himself making an awesome life for himself and his family, only to be thanked by his wife and my mother leaving him. Ultimately, it was he who wrote up the divorce papers and sent them to her, even though it was totally her who left him. That's just the way it works out sometimes, and there I go repeating my Dad's pattern. Ugh! Hopefully, I won't go obsolete and have to sell my house to buy a business (humbly requiring the ex's consent, because of co-signed mortgages) -- selling the home you bought and your kid grew up in to buy a business where you stress out and die, and your son has to shoot someone in the process of running it when he takes it over, per your very explicit instructions for him to do so in the case of your death... shit. And so... and so... I write. And I go all meta on my own ass, and yes... it's all about me AND my daughter now. No, it's not just all about me. And you know what? It never was.
I am a most giving person, and no, that giving does not make me a martyr-complex person, like that asinine writing of the 70s made everyone think of naturally giving and altruistic individuals. Sometimes being a good person is just for the sake of being a good person. It makes you feel good about yourself to share your life with others, even if the value exchange seems pretty out of whack. Well, I thought equilibrium would eventually be established, and a sense of fairness settle in. Life ain't fair, I know that. But at least there ought to be some tiny little indication that you're in it together, and not just relegated to being a tree, tapped to see how much can be drawn out of it. Yep, that's pretty much how I feel, and I haven't had that feeling since... well, since the heartache and grief of my own nuclear family's falling apart.

And so, the larger unit that was my family fell apart. It was an unstable molecule, anyway. The bonds that held it together were weak in the first place, and when they weakened even more, the interior repulsive force that keeps everything from collapsing into one point kicked in, and started spreading certain core elements apart. Valence electrons pop between different potentialities, and the very properties of the molecule shift. Oops, that one was radioactive. Or maybe an alpha particle hit it from something that was radioactive. In any case, we follow particle dynamics, I'm pretty sure. We're on the surface of the earth, which I'm pretty sure is like electrons on a valence shell -- even though there's what, like 8 billion of us? No matter, scales and numbers of things inside clustering and grouping and containing units can vary, but the rules are broad-strokes the same. Things revolve around things because stuff gets bent, and they spiral in towards each other like water spiraling around as it goes down a drain, closer and closer and tighter and tighter... until perhaps interior repulsive forces assert themselves, and after a little bounding, equilibrium is met, and the particles go into mutual orbit. Nuclear families. Parallels abound.

I come from a molecule that broke. I became exactly one such molecule that broke. What do I expect? How much is nature and how much is nurture? What half of me is Dad, and what half is Mom? I'll never be able to compare now with all of them gone. Well, there is Roz. And the uncles on her side... ba-dum bomp! Nah, really, who cares? I am who I am and these are the cards dealt to me, and now what am I going to do with them... again? ... from here? All good. Who really won the race of life if nobody we know of actually truly became something like immortal or transcendent or whatever? And no, I'm not talking about religious bullshit myth, although I'm quite certain a few utterly remarkable people have inhabited this planet -- truly other particles that only just barely manage to exist in the plane of electrons. Not actual gods, but trans-dimensional Wesley Crusher blip-creatures that maybe perchance pop in on us, or maybe perchance we're born as, or maybe perchance we claw our way up through some growth process to becoming. Individual growth perhaps, or more likely, as Iain Banks fills his pages with, something we evolve into. Yep, becoming one of those would probably... but not definitely... qualify as winning at life.
--------------------------------------------------------------------------------

## Thu Apr 14 19:01:19 EDT 2016

### Emancipation Pipulation

Okay, I need to create the confidence to "walk away from Pipulate" while it's processing, knowing it's going to be finished upon your return. 100% confidence... or at least 98% confidence... so long as you don't close your browser window or your machine doesn't go to sleep. The Web-based UI needs to see some activity on the other end, or else the open http connection (the single original page-load) ends, and Pipulate stops processing... by design! The Web UI is special with Pipulate. If you want it running in the background, then you schedule it. Maybe I should make a "run in the background" option.

At any rate, it's time to start thinking and talking out loud about the next stage of Pipulate's evolution. It's turning out to be a bit different than I imagined. I think I'm going to make another entry point via Flask, and fork the behavior based on the entry point. I have webpipulate. I just use Flask's routing function and make a different outer loop and login context. Adhere to the 80/20 Rule, and Rabbit Hole avoidance. Right now, for example, I'm grabbing titles from about 1000 pages. I want to take that machine home with me tonight, and I can't unplug it, because it would stop Pipulate running. I would like to be able to do something like running it in scheduling mode, but in an ad hoc, right-away way. I'm thinking that will be very popular.

So, next evolution of Pipulate is 99% reliability when using it through the Web Interface, and 99.9% reliability if in either scheduling or background mode. Okay, so... what are the components of the 99% reliable... oh, AND much faster... Pipulate outer loop?

- Using the sort of OAuth2 login context that's used for Mobile or offline apps
- Continuing to not need anything stored (between Pipulate sessions) on server
- Storing the OAuth2 refresh token on the browser in localStorage
- Encouraging the use of localhost with QEMU, keychain or gumstick PCs and such
- The ability to batch rows for chunkier and faster updates. # of rows in config
- Take Re-Pipulate out of the UI. Too dangerous, even with version history

As I watch Pipulate run, so much about it is spot-on correct. I got a lot of things right on the first pass. Now, it's time to up my Python game, and get some of the value out of listening to all these Talk Python to Me episodes, and delving into some of the books mentioned. It's also time for me to start publishing that Advanced Python article I've been thinking about.

--------------------------------------------------------------------------------

## Thu Apr 14 15:45:50 EDT 2016

### dip don't grok

Okay, continue using Pipulate day-to-day, but don't just go plowing new functions in and revising the framework. It's time to... no, I don't want to fork a branch of my own work, although there will be a big round of experimental revision. Instead... hmmm. Yes, make a wholly new parallel project as a testing framework against the gdata API using the gspread package. The most painful thing about Pipulate is by far the timeouts, due to both local OAuth2 expiration logouts during a long job AND just general API flakiness. This is the stuff dipshit basically attacked me like a rabid dog over, and had no patience or appreciation for the sausage factory of Github to endure. Haha, well his loss.
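Just to think out loud about what "rock-solid" might look like: below is a minimal sketch of the retry-with-backoff wrapper that a parallel gspread test project could start from. The retry_flaky decorator and batch_update helper are names I'm making up on the spot, and the worksheet.update_cells() call is only a stand-in for whatever batch call the real code ends up using -- a sketch, not Pipulate itself.

```python
import random
import time
from functools import wraps

def retry_flaky(max_tries=5, base_delay=2.0):
    """Retry a flaky API call with exponential backoff plus a little jitter."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_tries + 1):
                try:
                    return func(*args, **kwargs)
                except Exception as exc:  # real code would catch the API's own error types
                    if attempt == max_tries:
                        raise
                    # Wait 2s, 4s, 8s... plus up to a second of jitter before retrying.
                    delay = base_delay * (2 ** (attempt - 1)) + random.uniform(0, 1)
                    print("Attempt %d failed (%s); retrying in %.1f seconds" % (attempt, exc, delay))
                    time.sleep(delay)
        return wrapper
    return decorator

@retry_flaky(max_tries=5)
def batch_update(worksheet, cells):
    # Hypothetical helper: treat this whole function as a placeholder,
    # not production Pipulate code.
    return worksheet.update_cells(cells)
```

Exponential backoff with a little jitter is the boring, standard answer to both the OAuth2-expiry hiccups and plain API flakiness, and keeping that policy in one decorator beats sprinkling sleeps all over the outer loop.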
I keep chipping away, and eventually get something rock-solid, with all the show-stopping gotchas purged out, and those "special advantages" that I'm looking for included -- the out-on-the-edge crazy stuff that dip don't grok.

--------------------------------------------------------------------------------

## Thu Apr 14 12:39:51 EDT 2016

### Money, Money, Everywhere and Not A Cent To Spend

Just cancelled my cleaning service. I'll probably start using it once a month or something. But shit, I can't afford that any more. I still have two huge payments to make this month, and I'm rapidly going down to zero in my checking account already. My paycheck arrives tomorrow, so I'm not in terrible shape. But I'm tired of this dropping-to-zero shit. I shouldn't be living like a poor person at this stage in my life, and earning like I do. I have to start budgeting much more diligently -- or even at all.

Also just had lunch and broke a lot of mental momentum. On the plus side, immediately before lunch, I met with a guy on the commerce side who does a lot of reporting and has automated Python scripts running to help. I grabbed his whole directory of automation stuff, so I'll be able to pick that apart. I need to re-calibrate mentally and figure out next steps, and systematically plow through it all today. Don't take so friggin' long! Okay, focus on the Commerce URLs again, using a seeded list.

Function calls are relatively expensive in CPython. Inlining hot code can increase performance a lot. You get that for free with PyPy's JIT, but not with CPython (the main interpreter). Python has a hybrid thing in the object model that Michael Bayer talks about in an old blog post. Dig it up. Certain very handy patterns use it.

--------------------------------------------------------------------------------

## Thu Apr 14 09:45:42 EDT 2016

### Aleksey is Awesome And FOCUS!

At 11, I have the meeting about commerce URLs. Push hard to get the project in much more real form in this very hour. Listen to the Talk Python to Me (Michael Kennedy) podcast about SQLAlchemy and Michael Bayer. Interesting! There's a Core vs. ORM issue with SQLAlchemy. That's raw-SQL-style versus object-mapped calls.

- Start a new Google Sheet called SEO Pulse Mock-Up
- Get the book Patterns of Enterprise Application Architecture (distraction)
- Respond to Aleksey's message yesterday about his WMT API success
- Embed the mock-up picture on the first tab of the new GSheet

--------------------------------------------------------------------------------

## Thu Apr 14 08:44:56 EDT 2016

### I Am Not Dead Yet

To find your way in life, just be yourself. Of course, your entire life is a journey of discovering who you are... just in time to die, haha! Morbid? Nah, realistic. At some point you just gotta get with the fact that approximately 100 years (for easy math & if you're incredibly lucky) isn't really a lot of time, and you just gotta do with it... well, something. You're not going to just give up and die at like 12 years old. It takes until you're 28, and on the stellar path to the wrong kind of success (and usually in the music industry), before that happens. Though of course in my world, my mind goes to Aaron Swartz and Ilya Zhitomirskiy (not technically 28, but still). Lesson? Pull back! Slow down! Re-evaluate. Don't invent game-of-life rules in your head if they're going to drive you down self-destructive paths.

Okay, it's still only 9:20 AM. Focus and get on the right track today. These SEO reports are for very soon delivery. I have to hammer them into place today.
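Oh, and before it slips away -- circling back to that Core vs. ORM note from the Aleksey entry above, here's a tiny side-by-side sketch so future-me remembers what the two styles actually look like. The urls table and clicks column are made up purely for illustration, and the select([...]) spelling is the SQLAlchemy 1.x style; none of this is from our actual reports.

```python
from sqlalchemy import Column, Integer, String, create_engine, select
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

class Url(Base):
    """Made-up table purely for illustration."""
    __tablename__ = "urls"
    id = Column(Integer, primary_key=True)
    address = Column(String)
    clicks = Column(Integer)

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

# ORM style: you think in mapped objects.
session.add(Url(address="http://example.com/", clicks=42))
session.commit()
popular = session.query(Url).filter(Url.clicks > 10).all()

# Core style: you think in tables and SQL expressions (much closer to raw SQL).
urls = Url.__table__
with engine.connect() as conn:
    rows = conn.execute(
        select([urls.c.address, urls.c.clicks]).where(urls.c.clicks > 10)
    ).fetchall()
```

Same rows either way; Core speaks in tables and expressions, the ORM speaks in mapped objects.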
Finish what you would consider the visual mock-up where real data can be plugged in. That's been the goal for a while now, but... but... but... caught in loop. Escape loop. Rabbit hole evaluations. 80/20 rule. Everything you've learned and know... a BETTER set of game-of-life rules that WON'T drive anyone to kill themselves. Take things very seriously, but only insofar as you love that ongoing paycheck, and the situations you find yourself in every day. If you love those situations (and they don't sap you of all your energy every day), and you love the reward for BEING IN those situations, then be passionate about... well, not about preserving it, because that only makes you a tool. Be passionate about driving things up and forward to the next level, to increase your return on investment. This shouldn't make your life harder. Rather, it should make it EASIER. All caps is so much easier to remember than *markdown*. Okay, next step here and now. Check your schedule. Okay, a discussion

<pre>
My mission is modest
My purpose is clear
It's doing great stuff
So you'll know I was here

I am not a giant
Who's moving great boulders
But rather, I'm happy
To stand on their shoulders

Sharpening tools
That I wield in my labor
Folding and honing
A nano-sharp saber

Which I whip out
As I try not to fret
About obsolescence
I am not dead yet
</pre>

--------------------------------------------------------------------------------

## Wed Apr 13 15:42:17 EDT 2016

### Use SQL Gurus When You See Them

Pshwew! What a session with Marat. Okay, I have to continue plowing through this. I have some Google Search Console work to do today, but before that... hmmm, okay... look at the efficient version of the query from Aleksey. Okay, the SQL is getting insanely hard, to my way of thinking. Aleksey's version of my query is much better, having avoided joins and code repetition, but now I need to start documenting and tracking this SQL in vim and git. I'm not on the ZD Github repo yet, but I am paying for my own private repository, so get it going! You can always move it later.

--------------------------------------------------------------------------------

## Wed Apr 13 09:02:07 EDT 2016

### Acceptance of the Mac, Screenflow & VMWare Fusion Advantage

Haha, 9:02! Getting better. This morning was the first morning the cats woke me up at 6:30 AM that I didn't feel like going back to bed. I stayed up after feeding them, and eased myself into the day for the first time in memorable history where I didn't feel rushed or short-changed on sleep. The timing couldn't be better with the size of the challenges at work.

Windows rebooted my system overnight to apply updates. Time to re-instantiate my Windows 7 environment. I'm on 2 screens. Since it's Windows 7, I'm back to VirtuaWin for my virtual desktops. All the better, since I can pin my Outlook calendar to my second screen, so I never lose track of meetings. Outlook reminders are wonky, I believe because GMail is our main email system, and we're running synchronization software. Also, I moved my journal onto my own personal Mac that I keep here, which I can just see and feel is going to be part of my daily process here. It's nice to have a Mac. No matter how much better Windows is becoming, it still sleeps and recovers MUCH worse than Macs. You can close a Mac laptop 100% without fear of being met with black screens and strange sleep modes from which you cannot recover -- and where, when you do recover, you're lucky to get WiFi again.
Plus, now having become so completely multi-platform, it pains me to not have multiple platforms at my disposal, for whatever reason. What about virtualization, you say? Have you ever tried to virtualize a Mac on Windows? Doable, but I gave up on that fragile hackintosh shit during my Amiga days. VMWare Fusion (and all the others) are just so mature and awesome and virtual-screen friendly on the Mac that it's insane not to use them. Windows didn't even get virtual desktops until Windows 10, hahaha! Oh, and Screenflow... ahhhh, Screenflow. So much better than Camtasia Studio, it's almost sad, with everything from how it captures and displays (all) keyboard strokes, to how it almost magically combines with VMWare Fusion to jump from OS to OS to OS seamlessly in the same video. Speaking of which, I really have to get back to those talking head videos sooner rather than later. Oh, get the other OSes installed on your Mac now. I have to get a Windows image, shoot. Not something for today... rabbit hole. But keep VMWare Fusion pinned to your Mac taskbar as a reminder. Your environment IS your checklist! Download Ubuntu 14.04 just to get the ball rolling. That'll be your spinning plate today. It's just over a gigabyte download.

--------------------------------------------------------------------------------

## Tue Apr 12 23:16:03 EDT 2016

### Journaling System Like a DNA Strand

Hello, I am a single-file weblog-like thingamajigger, but I do have some structure. I'm written in Markdown. View-source of this page. You might be surprised. I'm designed to be a forever-evolving self-programming that works a lot like a DNA molecule. Because it's one long strand, there's not much navigation beyond forward/back, with only each end as fixed points, and everything else relative to that. And we're a sort of Turing Machine, really. But our ribbons aren't infinite. Here's what you'll find along mine, from top to bottom:

- HTML element wrappers
- An xmp element that strapdown.js works against
- Beginning of Markdown
- Super-short list of things to remind yourself every day
- Super-short To-Do list meant to be seen immediately above journal
- Reverse-chronological journal entries, timestamped with markdown H2's
- Reminders of Awesome
- List of Important Lists with unique hashtags for shift-8 position jumps
- Important tech to follow like IPFS and native mobile app development
- Points-of-Organization list of things like wallet, file cabinet, etc.
- Plenty of other meaningful and profound lists-of-things like these
- Closing of the xmp element
- Any JavaScript or CSS that needs to run in context of the body element
- Closing the HTML element

We don't need no stinkin' Jekyll transformations.

--------------------------------------------------------------------------------

## Tue Apr 12 21:47:45 EDT 2016

### Now Off To Fold My Laundry

Well, I'm back at home, and I forgot to commit and push my entries at the office, and I'm going right ahead and typing here, knowing I'll be making a conflict at the office that needs to be merged. There's git Kung Fu that can force or resolve the merge conflicts. Then there's just viewing source on the mikelevinseo.com website, copying the uncommitted & unpushed work at the office into an OS copy-buffer, doing a forced git pull that overrides my local edits, and then pasting those back into THIS branch, which is also the origin master head. Pshwew! Okay, stayed late at the office. Just getting home. Cats are fine. Deja Vu all over the place.
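Circling back to the DNA-strand idea above for a second: since the whole journal is one long Markdown strand with 80-dash rules between entries and a timestamped H2 at the top of each, slicing it back into individual entries should only take a few lines of Python. A rough sketch -- the journal.md filename and the split_entries helper are both invented on the spot, nothing that exists yet:

```python
import re

RULE = "-" * 80  # the 80-dash separator between journal entries
STAMP = re.compile(r"^## \w{3} \w{3} \d{1,2} \d{2}:\d{2}:\d{2} \w{3,4} \d{4}$")

def split_entries(text):
    """Split the one long strand into (timestamp, body) pairs, newest first."""
    entries = []
    for chunk in text.split(RULE):
        lines = chunk.strip().splitlines()
        if lines and STAMP.match(lines[0]):
            stamp = lines[0].lstrip("# ").strip()
            body = "\n".join(lines[1:]).strip()
            entries.append((stamp, body))
    return entries

if __name__ == "__main__":
    with open("journal.md", encoding="utf-8") as f:  # hypothetical filename
        for stamp, body in split_entries(f.read())[:3]:
            print(stamp, "->", len(body), "characters")
```

Nothing fancy, but it shows how little structure the strand really needs: the separators and timestamps do all the work.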
I video conferenced with Adi from the ZD office, and let me tell you, does that high-bandwidth connection make a difference! Wow, I need to make sure I answer Eva's video-calls from the Mac, and not my Note 5, to see if there's any difference between using Hangouts on a desktop Mac versus the Samsung phone I've used so far. I really need to isolate whether the performance issues come down to bandwidth. If it's bandwidth, then hmmm. I don't exactly know what I'm going to do to make those video calls better. But if it's hardware or lack of optimization in the app on a particular platform, then I can change hardware easily enough.

Okay, what now? Now, the little bit of progress you need to make is to clear your bed of the clothes you pulled out onto it this morning. Sort the easy stuff. Remember, 80/20 rule. Don't fold anything, and don't match socks, or any of that time-wasting crap. Just hang up shirts and pants, fold all T-Shirts (the single-motion fold), and get my drawers more-or-less sorted. Don't chase the laundry rabbit down the laundry rabbit hole. Every day is another day. Get SOMETHING meaningful done, bank the results, leverage the results the next day, and push yourself a bit to get more than the lazy norm's worth of stuff done each day.

Good advice at the office too. Gotta move faster than in Agency-mode, where for whatever reason, projects aren't as ambitious, and deliverables are not as fabulously interesting. ZD is giving me a unique opportunity there -- in fact, I spoke one-on-one with Vivek today, and wow, if I'm not tailor-made for ZD at this moment in time, I don't know who is. I feel direct continuity with my most interesting past-life projects and goals and aspirations, almost as if I rolled back my career to Scala, where I should have been doing take-over-the-world stuff with a take-over-the-world team, but instead was caught up in a post-Commodore-demise quagmire funk. The Amiga computer was taken away, boo hoo! And I lost all my best secret weapons, namely AREXX for inter-process communication to automate apps like ADPro and DPaint IV. Oh, did I mention DPaint? Oh, did I mention DPaint? Oh, did I mention DPaint? Yes, DPaint played a big role in my develop