Thursday, September 24, 2015

2 spaces after a period, or one?

Now, I'm not much of a writer.  If you follow this blog, that much is probably apparent.  Also, I'm pretty old, and we old people usually started typing on typewriters, or something that output paper, like a word processor.

Anyway, when I started typing, typewriters made every character the same width: whether you were typing "iiiiiiiiiiiiii" or "eeeeeeeeeeeeee", they came out the same width on the paper. Now, to make reading easier and give more weight to sentence termination, we were taught to put two spaces after every period.

Some habits are just hard to break.

Wednesday, September 9, 2015

XenServer 6.5 fails to install with splashy graphics error

Forget the hard time I had getting a working XenServer 6.5 installer onto a USB drive. Once I finally had the installer happily occupying the entirety of my shiny new 64GiB USB flash drive, even more trouble brewed.

I tried the installer on 4 different servers, all different varieties of SuperMicro hardware, and each one failed with an error like this:


Installation of tmp/splashy-graphics-xenserver-0.3.9-xs1393.x86_64.rpm failed.
ln: creating hard link '/usr/share/splash/background.png' to '/usr/share/splashy/themes/citrix-theme/background.png': File exists
error:
%post(splashy-graphics-xenserver-0.3.9-xs139...
Now, I don't give two turds about what splashy graphics are, and it's unlikely I'll ever need them on this headless server, but the error abruptly interrupts the install process, which I do need...

After much googling, I realize that there are literally dozens of us. We all have this splashy graphics problem, and no one concisely describes the fix. So, here it is:

While the server is installing, but before it gets to 90% or so, simply hit Alt-F2 to drop to a shell.

Then delete the background.png file at /tmp/root/usr/share/splash

# rm /tmp/root/usr/share/splash/background.png

Then hit Alt-F1 to go back to the installation terminal.

Problem solved. Now, I'm off to babysit the rest of these server installs.

Sunday, August 30, 2015

The problems with Apple's iOS App Store

This is going to be a rant. If you don't like rants, click away now.

I'm writing this more for me than for you. I just need to vent this frustration, and let's face it, Apple is kind of big now, so my little concerns are easily overlooked.

Now, the core problem here is that we goofed, we messed up, we didn't properly test before releasing a patch. This can happen to anyone, but it's certainly more likely to happen when there are only 2 of you and you do everything yourselves.

Long story short, version 1.0.5 of our app was working fine, humming along, selling at least a couple in-app purchases every day. Nothing to quit my day job over, but at least we'd be able to pay for the developer license, and maybe even the domain names. Along comes update 1.0.6, adding a few fixes and more languages to the list of localized languages. Unfortunately, it introduced a bug that basically breaks the app completely: once the user encounters it, the app crashes right after launch.

Of course we noticed the problem only a few hours after the update went live. We had a patch created and pushed to the store within an hour of that, even with parental duties and it being the weekend and all...

Fast forward 8 days, and we're still waiting on Apple to approve the patch. Now, this wouldn't be so bad, if we could simply pull version 1.0.6 and revert to 1.0.5.  However, that's not even possible. It wouldn't be so bad if there were some way to fast track our patch, to tell Apple, "Hey look, we need this patched urgently, our app is functionally broken". However, that's not possible either.

So, here we are pissing off real customers, probably losing numerous sales, and generally generating ill will, because our hands are completely tied.

It really is frustrating.

Rant is over.

If you have something constructive, like "hey, here's a way to fast track your app update," or "hey, you know you could do this to disable your bad update," I'd really enjoy hearing it.

Thursday, August 20, 2015

Odds of 500,000 heads and 500,000 tails

If you flip a perfect coin* 1 million times, what are the odds that you would get exactly 500,000 heads and 500,000 tails?

A) 0%
B) 0.08%
C) 1%
D) 10%
E) 50%
F) 100%

Go ahead and try to explain your answer.

*perfect coin: A coin that has exactly a 50% chance of landing on either heads or tails.

Think of the simplest case: 2 coin flips. There are 4 possible outcomes: both heads, both tails, or 1 of each.

Wait, I just said 4 possible outcomes, then described 3?  Why?

Well, think of it like this:

HH, TT, HT, TH

Getting heads then tails is a different outcome than tails then heads, but both result in 1 of each. So, judging from that, you say, well 2 of 4, that's 50%, I knew the answer was E! Hah, not so fast...

Now, let's look at the next simplest case.  4 coin flips. Here are the possible outcomes:

HHHH, HHHT, HHTH, [HHTT], HTHH, [HTHT], [HTTH], HTTT, THHH, [THHT], [THTH], THTT, [TTHH], TTHT, TTTH, TTTT

So, there are 16 possibilities, and only 6 have equal numbers of heads and tails; I surrounded those in square brackets.  Uh oh, that's 37.5%, and it's probably going to get worse as we increase the flips towards 1 million. So maybe you're leaning towards answer C or D now, right?
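You don't have to enumerate these by hand, either. A few lines of Python (my addition, not part of the original post) confirm the count:

```python
from itertools import product

# Enumerate every possible sequence of 4 coin flips.
outcomes = list(product('HT', repeat=4))

# Keep only the sequences with an equal number of heads and tails.
balanced = [o for o in outcomes if o.count('H') == o.count('T')]

print(len(outcomes))                         # 16 possible sequences
print(len(balanced))                         # 6 balanced ones
print(float(len(balanced)) / len(outcomes))  # 0.375, i.e. 37.5%
```

Bump `repeat=4` up to 6 or 8 and you can watch the percentage keep shrinking.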

From here I could go into the topics of binomial coefficients, Pascal's triangle and other worthwhile, interesting topics, but I'll skip the heavy math and just show you the dude's triangle.

     1
    1 1
  1 (2) 1   summed = (4)
 1 3   3 1
1 4 (6) 4 1 summed = (16)

Hopefully you recognize Pascal's triangle, but if you don't there's always Wikipedia.

We only care about every other row, as you need an even number of flips to obtain a perfectly even split of heads and tails.  You might recognize the numbers I put in parentheses: those are the results we saw for 2 and 4 flips. So, now we just have to draw the next 999,996 rows of the triangle.

Don't worry, I did this on a separate sheet of paper, and the result was roughly 0.08%.

If you don't believe me, you can ask WolframAlpha.
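For the skeptical but lazy, here's a short check of my own (not from the original post): it computes C(n, n/2) / 2^n directly, using Python's math.lgamma so the gigantic binomial coefficient never has to be built exactly.

```python
from math import exp, lgamma, log

def even_split_probability(n):
    """Probability of exactly n/2 heads in n flips of a perfect coin.

    This is C(n, n/2) / 2**n, computed via log-gamma to keep the
    arithmetic in ordinary floating point.
    """
    if n % 2:
        return 0.0  # an odd number of flips can never split evenly
    half = n // 2
    log_p = lgamma(n + 1) - 2 * lgamma(half + 1) - n * log(2)
    return exp(log_p)

print(even_split_probability(2))      # ~0.5, matching the 2-flip case
print(even_split_probability(4))      # ~0.375, matching the 4-flip case
print(even_split_probability(10**6))  # ~0.0008, i.e. roughly 0.08%
```

The 2-flip and 4-flip values match the hand counts above, and the 1-million-flip value lands right on answer B.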


Thursday, February 19, 2015

nosetests: error: no such option: --with-xunit

Sometimes, nosetests is a liar.

For some reason, we were seeing this error in the Jenkins console output:

    nosetests: error: no such option: --with-xunit

Of course, the first thing to do is to google the problem.  Why doesn't nose like this parameter any more?  Maybe we updated versions, and now it's no longer accepted.

Well, unfortunately, that's not the case.

    nosetests --help

Tells me:

    --with-xunit    Enable plugin Xunit: This plugin provides test results
                    in the standard XUnit XML format. [NOSE_WITH_XUNIT]

Doh, that's not it.  However, scrolling up a few lines in the Jenkins console output, I see that one of our

    python setup.py develop 

commands was failing miserably, but the error wasn't being caught.  

I saw this in the console output:

    Couldn't find index page for 'our_thing' (maybe misspelled?)
    No local packages or download links found for our-thing
    error: Could not find suitable distribution for Requirement.parse('our-thing')

Fixed that problem, and nosetests was back in business.  Of course, we had lots of broken tests, because the nosetests error wasn't being caught as a failure by Jenkins.  Stupidity on top of idiocy.

<sigh>

Thursday, October 2, 2014

T-Mobile vs AT&T 7 day Test Drive

Originally, this post wasn't going to have any code in it. I was just going to tell you the result, but what fun would that be? Also, it wouldn't really fit the theme of the blog.

Disclaimer: I've been an AT&T cell subscriber since 1999 when they were still called Cingular Wireless.


That said, you might think I'm biased, right? But which way?

Anyhow, T-Mobile was offering an opportunity to test drive their network. Basically, they send you an iPhone 5S in the mail, and you get to try it out for a week. Luckily, for the sake of science, my personal phone is also an iPhone 5S, but on the AT&T network.

The plan was fairly simple: I'd run Ookla's speed test app at a number of different locations that I commonly travel to. My daily commute takes me from the Far East Bay to downtown Oakland, and I also went to Berkeley for soccer practice. I was expecting to just eyeball it, and if the conclusion was "Hmm, well, it's pretty close," I'd be tempted to switch from AT&T to T-Mobile.

Then I discovered that the speedtest app will let you email a csv file containing all your test results, so of course I had to take advantage of that and put the results under further scrutiny.

So, I wrote up about 60 lines of throwaway Python code to evaluate the results. I graded the results in three parts:

First, if one network was faster in both download and upload speeds, I'd call that a win for it. If one network won upload and the other won download, I'd call that mixed results.

The second criterion I looked at was simply average upload and download speeds across all the tests.

The third criterion was how many dead-zones, with little to no network speed, I found. I played around with the thresholds for what I considered to be a "dead-zone".

There are probably a lot more interesting things I could do with the data, and I may yet do so, as I'm only on day 4 of the test.

Here is the throwaway Python code I used:
import csv
import os

with open('data' + os.sep + 'att.csv', 'rb') as a:
    a_data = csv.reader(a)
    att_data = []
    for row in a_data:
        att_data.append(row)

with open('data' + os.sep + 'tmobile.csv', 'rb') as t:
    t_data = csv.reader(t)
    tmobile_data = []
    for row in t_data:
        tmobile_data.append(row)

att_wins = 0
tmobile_wins = 0
mixed = 0

att_sum_dl = 0
tmobile_sum_dl = 0
att_sum_ul = 0
tmobile_sum_ul = 0

total_comparisons = 0

att_dead_zones = 0
tmobile_dead_zones = 0
both_dead_zones = 0
dl_dead_threshold = 1000
ul_dead_threshold = 500

for a_row in att_data:
    for t_row in tmobile_data:
        if a_row and t_row:  # Ignore blank lines
            if a_row[0] == t_row[0] and a_row[0] != 'Date':  # if times match exactly
                if int(a_row[4]) > int(t_row[4]) and int(a_row[5]) > int(t_row[5]):
                    att_wins += 1
                elif int(a_row[4]) < int(t_row[4]) and int(a_row[5]) < int(t_row[5]):
                    tmobile_wins += 1
                else:
                    # One network won download and the other upload (or tied)
                    mixed += 1
                if int(a_row[4]) < dl_dead_threshold or int(a_row[5]) < ul_dead_threshold:
                    if int(t_row[4]) < dl_dead_threshold or int(t_row[5]) < ul_dead_threshold:
                        both_dead_zones += 1
                    else:
                        att_dead_zones += 1
                if int(t_row[4]) < dl_dead_threshold or int(t_row[5]) < ul_dead_threshold:
                    if int(a_row[4]) < dl_dead_threshold or int(a_row[5]) < ul_dead_threshold:
                        pass
                    else:
                        tmobile_dead_zones += 1

                att_sum_dl += int(a_row[4])
                tmobile_sum_dl += int(t_row[4])
                att_sum_ul += int(a_row[5])
                tmobile_sum_ul += int(t_row[5])
                total_comparisons += 1

print 'AT&T wins:     ' + str(att_wins)
print 'T-mobile wins: ' + str(tmobile_wins)
print 'mixed results: ' + str(mixed)
print 'AT&T average download:     ' + str(att_sum_dl // total_comparisons)
print 'T-mobile average download: ' + str(tmobile_sum_dl // total_comparisons)
print 'AT&T average upload:       ' + str(att_sum_ul // total_comparisons)
print 'T-mobile average upload:   ' + str(tmobile_sum_ul // total_comparisons)
print 'AT&T deadzones:     ' + str(att_dead_zones)
print 'T-mobile deadzones: ' + str(tmobile_dead_zones)
print 'both deadzones:     ' + str(both_dead_zones)
And here are the results:
AT&T wins: 13
T-mobile wins: 39
mixed results: 22


AT&T average download: 14926
T-mobile average download: 17747
AT&T average upload: 6495
T-mobile average upload: 10466

AT&T deadzones: 7
T-mobile deadzones: 6
both deadzones: 4
As you can see, it was pretty much a slaughter.

One thing to note was that Berkeley seemed to have better AT&T coverage than T-mobile.
Also, the only place along the BART line from Dublin/Pleasanton to 12th Street Oakland that AT&T wins is at the Oakland Coliseum stop.

Thursday, September 25, 2014

XenServer 6.2 and CentOS 6.5 frustration...

So I've just provisioned a brand new XenServer and patched it all the way to the hilt.

Next step of course is to create some sweet new VMs on my little virtualization platform.

I go through the most basic of steps, ones any 12-year-old sysadmin could navigate...

Of course I'll Install CentOS 6.5 from a URL, because who wants to keep an ISO library on hand, am I right?

Well, my mistake is captured in this screen shot, and if you can see it immediately, you should have a bright future as an IT monkey.



The way this error manifests itself is equally elusive, and it took me longer to discover the root cause than it took to write this blog post. Those types of issues are usually blog worthy.

Anyhow, back on topic. The error presented to me looks like this:

Sep 25, 2014 12:26:34 PM Error: Starting VM 'CentOS 6 (64-bit) (1)' - Internal error: xenopsd internal error: XenguestHelper.Xenctrl_dom_linux_build_failure(2, " panic: xc_dom_core.c:616: xc_dom_find_loader: no loader\\\"")


Well, technically, it shows up in this hard-to-copy status area of XenCenter.



However, you can find this in the server's log and then copy the text from there. Hah, of course, now I've got an actual error string, and Google, problem solved, right? Hecky nah.

Googling this error will lead you down a nefarious twisted trail of hopelessness, especially considering how simple the cause and effect were in my case.

30 minutes of searching for bootloaders gone missing, corrupt device drivers, corrupt disk drives, and little green men from outer space led me nowhere good.

Finally, I realized the problem. If you see where the cursor is in that first screen shot, there is simply some whitespace at the end of the Installation URL. That simple mistake caused this avalanche of bizarre errors. I'm just so glad I found the root cause, because these types of issues are nothing but aggravation.