Tuesday, March 28, 2017

Another Bucket List Item

I’ve put together a list of items I hope to accomplish before we leave for Romania.  As a music guy, of course that includes a list of bands and concerts I’d love to see before I go.  I just received a notification that one of the bands, OK Go, is coming to Baltimore in June!

In case you don’t know, OK Go is an incredible band that rose to fame through their now-famous treadmill dance video for “Here It Goes Again.”  I found them through other means, when they chose to go their own way and publish music as an independent act instead of through a major label.  I’ve since grown to love their music and appreciate their artistry.  Below is their most recent video.  It’s worth checking out, and I’d also recommend watching the making-of videos.

For reference, here’s the list of shows I’m hoping to see before I leave:

For those interested, I have a history with Darlingside and Jamie Kent (albeit a bit loose … cue Weird Al Yankovic’s Lame Claim to Fame) in that they toured together back in 2012.  One stop they made was Ebenezers Coffeehouse here in DC.  I was front of house for that show, so now I can be “that guy” who hollers “I knew them when …”.

On another subject, we also get to see Empire of the Sun at Echostage, and that should be a fun show too.  I also got to see Save Ferris a month or so ago at the Black Cat.

Thursday, January 05, 2017

NAS Update

I’m about two weeks into having the NAS and so far so good.  I’ve spent much of the last two weeks sorting out all my pictures (see my other blog posts about that process) in preparation for the big move.  I’d spent so much time at one point or another making duplicates or moving files around that I had at least two copies of almost every picture I’d ever taken.  Needless to say, that wasn’t really sustainable long term.  It also meant that my file server had more than its fair share of extra files on it.  Now I’m at the point where I have, basically, one good copy of all my pictures.  Oh, and I’ve held onto the iPhoto libraries from all the way back too, just in case.

This means that I was able to move some of my three-terabyte Western Digital Red drives into the NAS.  Of course, in true moving-a-bit-too-fast-for-my-own-good style, I grabbed the wrong drives out of the server, only to have it boot up and tell me it had no working arrays … whoops.  Fortunately for me, Synology’s DiskStation Manager (DSM) didn’t immediately grab the drives and try to reformat them.  As such, I just stuck the correct drives back into the old server, grabbed the right ones, booted the system and all was well.

With the additional drives in the new NAS, it was time to add them to the array.  Right now (24 hours later), it’s at 25% expansion of the array.  It’s definitely taking longer than I would have expected to complete the expansion, but I’m hoping that’s a one-time deal.  It is something to be aware of, though, if you plan on expanding an array in a hurry.

A few other notes:

  • I currently use my old server as an iTunes server.  No worries, DSM offers both Plex and a native iTunes server.  The issue I’m seeing right now is that the built-in server doesn’t support playlists, other than smart playlists.  Given that I have a playlist I’ve played for the kids almost every night since my son was born, not having access to that capability is a bit of a pain.  To top it off, I can’t seem to get Plex audio playlists to show up on my Roku.  Update: Turns out I was wrong; I just needed to install Synology’s Audio Station to create static playlists.  With that and the Video Station, I’ve uninstalled Plex for now.
  • The web-based UI is decent.  It’s still a web-based UI, but overall it seems competent.  There are places where I can’t always seem to find what I’m looking for, but that’s how it goes.  The biggest issue for me is the split between items you control in the control panel, items you need to go to the package manager to manage, and “applications” that appear in the app start menu thingy (see the UI below with the menu expanded).  The big one was the Storage Manager application.  I had to go there to change how the storage was managed.  An OK thing once you know about it, but, to me, that belongs in the control panel.  I guess that’s a minor gripe, because now that I know where it is …
  • Synology also offers Android and iOS applications for certain key features, such as photo, video and music browsing.  I haven’t spent too much time with those yet, but they look interesting and may provide a decent alternative to Plex.  And they have Google Cast capabilities built in!

[Image: DSM UI with the application menu expanded]

Thursday, December 29, 2016

A new NAS

After 10 years (or so) of using a PC as a file server, I finally decided enough is enough.  The size, noise and upkeep of my old PC were just a bit too much for me to want to deal with any longer.  Not that there’s anything wrong with using a PC as a file server; it just seems a bit overkill now.

My original goal when I built the PC server was to use virtualization to run various servers and experiment with various technologies.  While that worked for a time, the amount of memory I needed grew beyond what I had, and the price to upgrade became too much for what is basically an eight-year-old machine.

The size also became a bit too much to handle.  When I built the machine originally, I bought a huge full-tower case, assuming I would ultimately need multiple HDDs and a big power supply.  Had I stuck with virtualization, this might have made more sense.  Now, it’s just a noisy boat anchor that serves (pun intended) as a file and iTunes server.

One other reason to move away is the advent of streaming music services (Google Play, which I use).  I now rely much less on iTunes sharing than ever before.  My kids have a bedtime playlist they use every night to go to sleep, but other than that (and the occasional Sonos use), I don’t rely on it as heavily as before.

With this in mind, I started looking at standalone Network Attached Storage devices.  I like the idea of a bespoke device to manage this setup.  I also like the size of the device as opposed to the beastie I have now.  The new devices also appear to offer more functionality and can, basically, replace a lot of the things I needed a standalone PC for before.

If you’re curious, I ended up with a Synology DS416play.  It was slightly more expensive than I was planning on, but it offers better media capabilities, including hardware-based media transcoding.  This was something I’d tried on the old system, and it just couldn’t keep up.  I’m hoping to dump much of my media on there and share it out to my Chromecast and Roku devices.

As I spend more time with it, I’ll post more.

Smart House

Earlier this year, I finally broke down and purchased a Samsung SmartThings home automation hub.  I had wanted one for about two years, ever since hearing them advertised on the TWiT network.  I waited until I had some spare cash and for the SmartThings 2.0 hub to come out.  The big draw of the 2.0 was some limited local, network-only remote control.

Anyway, it started with just a Z-Wave front door lock and has begun to spread.  I now have several receptacle and smart appliance switches, a garage door opener, a door sensor and a number of Aeotec water sensors.

In fact, those just saved me today when we had a small water problem in the basement.  Somehow the hot water was left on and that caused a poorly installed trap to slip just enough to let water start leaking onto the floor.

When I bought the sensors originally, that was the very first place I intended to put one, because this has happened to me before.  The sink is a pedestal sink, which makes it hard to get at the trap in case of a leak.  When it happened before, it took me a while to catch it, and I’ve been a bit paranoid about it happening again ever since.

Well, this time the sensor saved me.  The app first indicated a water problem in the basement at 1:34pm.  By 1:45ish, I’d fixed the problem and cleaned up the mess.

Oh, and thanks for the assist goes to Pushbullet.  I’ve been using it for a number of years to handle notification mirroring from my Android phones to Chrome on my PCs.  In this case, the notification popped up on my laptop while I was doing something else, so I was able to react immediately.  Without it, I may not have caught the leak until a bit later.  Come to think of it, I may want to update the action on water problems to include an email …

Tuesday, December 27, 2016

More Details on Photo Sorting

After about three weeks of work, I believe I’ve identified almost 167,000 unique pictures (give or take a thousand or so) and over 250k duplicates.  Right now, OneDrive is struggling to determine which of the myriad changes I’ve made are valid and to sync them up to the cloud.  I expect that to take quite some time.

I still haven’t re-sorted them into the Year/Month/Day/Model scheme I mentioned earlier, but I do plan on doing that eventually.  First I need to let this sync finish; then I plan on identifying which cameras belong to us, family and friends so I can separate ours/theirs/etc.  Once that’s done, I’m going to slowly perform the final reorgs of Mobile/Regular and apply the new sort.
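For what it’s worth, that re-sort shouldn’t need a full re-run of the metadata extraction; since the date is already encoded in the path, a small script can pivot the tree in place.  A rough sketch (the `resort_tree` function and the Sorted/Resorted paths are hypothetical, not part of my actual script):

```shell
#!/bin/bash
# Sketch: pivot an existing <model>/<year>/<month>/<day> tree
# into <year>/<month>/<day>/<model> without re-reading metadata.

resort_tree()
{
    local SRC="$1"
    local DST="$2"

    # Files sit five levels below SRC: model/year/month/day/<file>
    find "$SRC" -mindepth 5 -maxdepth 5 -type f | while read -r FILE ; do
        local REL="${FILE#$SRC/}"       # model/year/month/day/file
        local MODEL="${REL%%/*}"        # leading model component
        local REST="${REL#*/}"          # year/month/day/file
        local DATEPART="${REST%/*}"     # year/month/day
        local NAME="${REST##*/}"        # bare file name

        mkdir -p "$DST/$DATEPART/$MODEL"
        mv -n "$FILE" "$DST/$DATEPART/$MODEL/$NAME"
    done
}
```

Since the file names themselves already carry the timestamp, nothing else has to change.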

I’m also considering whether I should separate out all the videos.  Part of me says yes, part of me says no.  Having the videos separate might make it easier to prepare videos and other multimedia projects in the future, but there’s also something nice about having the pictures and videos intermingled; it can lead to some neat discoveries.

I’ll post more as I go.

Sunday, December 18, 2016

Sorting out Photos Revisited

Since my last post, I’ve continued running the script to process my images and am getting close to being done.  Given that, there are a few things I would like to have done differently:

  • Right now the script organizes pictures as <model>/<year>/<month>/<day>/<picture or video>.  While that works OK, it does mean that to find a specific picture I need to know which camera model took it.  I’m starting to think I should have turned that around: <year>/<month>/<day>/<model>.  At least then I would only need to know the approximate day it was taken.  However, I do like the idea of being able to grab all the mobile phone pics/videos and move them en masse.
  • Videos don’t, by default, include the camera model.  I’m not really sure why this is, but it does make them kind of annoying to sort out.  I have to build some strange rules/heuristics/guesses to determine which camera took which video.  Maybe not an issue for deduping, but definitely a pain if I want to keep them separated into Mobile/Non-Mobile.
  • macOS’s mdls command does a great job of dragging the capture date out of videos, but only if the file sits on a drive mounted locally on the Mac itself.  That includes FAT32 and other Windows-based file systems.  If you mount the same file system via NTFS, it gets confused.  As such, I’ve had to leave the AVI files behind so I can pull them local later.  (Though, as I think of it, maybe mediainfo would work for AVI files … not sure why I didn’t try that.)
  • My previous script only looks at the first conflict, meaning that once I have a single non-duplicated file, any more duplicates may be copied repeatedly.  That’s my next correction.
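For the curious, that first-conflict fix could look something like this: instead of hashing only the `.0` file, walk every existing suffix and compare hashes before allocating a new one.  A sketch (the `placeUnique` name is made up; the default hash command here is Linux’s `md5sum`, while my script uses macOS’s `md5 -r`):

```shell
#!/bin/bash
# Sketch: place SOURCE at TARGETBASE.<n>.EXT, hash-comparing against EVERY
# existing suffix so later duplicates are caught too, not just the .0 file.

placeUnique()
{
    local SOURCE="$1"
    local TARGETBASE="$2"            # e.g. .../Sorted/.../20140705111216
    local EXT="$3"
    local HASHCMD="${4:-md5sum}"     # md5sum on Linux; pass "md5 -r" on macOS

    local SRCHASH=$($HASHCMD "$SOURCE" | awk '{ print $1 }')
    local N=0

    # Compare against each suffix already on disk before taking a new one
    while [ -f "$TARGETBASE.$N.$EXT" ] ; do
        local DSTHASH=$($HASHCMD "$TARGETBASE.$N.$EXT" | awk '{ print $1 }')
        if [ "$SRCHASH" = "$DSTHASH" ] ; then
            echo "duplicate of $TARGETBASE.$N.$EXT"
            return 1
        fi
        N=$(( N + 1 ))
    done

    mv -n "$SOURCE" "$TARGETBASE.$N.$EXT"
}
```

On a duplicate it leaves the source in place and returns non-zero, so the caller can route it to the duplicates tree.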

Overall, the process has been a bit of a pain in the backside and definitely slow, but I’m almost to the point where I have one canonical copy of all my pictures.  (Well, actually two, because I’m working off my local OneDrive mirror.)

Sunday, December 11, 2016

Sorting Out Photos

My wife and I are (or at least were) shutterbugs of a sort.  At this moment, I have just a bit shy of 1.5 terabytes of photos that she and I have taken over the years.  I’ve also managed to make a hash of them with copies, duplicates and the occasional “I think I have this somewhere, but I can’t say for certain” directory.

I’ve been looking for a solution for the past year or two and still haven’t really found one I liked, so, like a good nerd, I’ve rolled my own.  It’s cobbled together using BASH, ImageMagick, dcraw and MediaInfo.

My primary goal was to make sure that I had one copy of every file, not necessarily one high-quality version of each picture or video.  Meaning, if I end up with duplicates of a picture in RAW, high-res JPEG and a JPEG thumb, I’m OK with that.  Once I have the initial culling of the photos done, I may take another swipe at further deduping.

Anyway, my script starts by recursively looping through the current path and all subdirectories.  If it encounters a file, it will retrieve the extension and then conditionally call some combination of the above utilities to retrieve the creation/capture/modified timestamp and the camera make/model.  It does this fairly well, but there are some major caveats which I’ll discuss in a bit.

Once it has retrieved the above, it starts creating the following folder structure:

/<camera>/<year>/<month>/<day>

It then takes the file and tries to copy it into the following:

/<camera>/<year>/<month>/<day>/<year><month><day><hour><minute><second>.<#>.<ext>

This should, in theory, allow me to identify a specific picture taken by a specific type of camera at a specific moment in time.  The initial issue I ran into with this is around time resolution.  The timestamps given to me by the various tools only resolve to the second (not millisecond like I’d prefer).  This means that if you have a camera that can take multiple pictures per second, then you can easily end up with duplicates, hence the <#> at the end.

If I encounter a file with the same timestamp, I do an MD5 sum on both files to confirm they are actually the same file.  If they are, then off to a duplicates tree the file goes.  If they aren’t the same, then I start an auto-increment pass until I can write the file out uniquely in the target folder.

One issue, though, is that if the .0 file doesn’t match, I don’t check for subsequent matches, so the script could easily end up with files 1, 2, 3 and 4 all being duplicates.  Maybe I’ll try to fix that in a future edit.

As for the tooling, I use the following:

  • BASH
  • ImageMagick’s “identify -verbose” command to get information on JPEGs
  • MediaInfo for details on MP4/M4V/AVI/MOV files
  • dcraw for details on RAW files (such as Nikon’s NEF/NRW)

Probably one of the biggest issues I have is that while your typical JPG/NRW/NEF file includes the camera details, a video typically does not.  That means I’m a bit hard pressed to determine which camera took a specific video.  I also found that the metadata about when a video was captured isn’t always that useful, so there are limits.

One other thing to note: ImageMagick isn’t always that fast, so there’s room for improvement here, specifically around JPEG processing.  I was hoping to use macOS’s mdls for getting the camera data, but that only works if the filesystem is local (not mounted like mine was).
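One easy win on the speed front: the script below calls `identify -verbose` twice per JPEG (once for the date, once for the model).  Capturing the output once and parsing both fields from it halves the ImageMagick cost.  A sketch (the `parseIdentify` helper is hypothetical):

```shell
#!/bin/bash
# Sketch: run `identify -verbose` once per JPEG, then pull both EXIF fields
# out of the captured text instead of invoking ImageMagick a second time.

parseIdentify()
{
    local INFO="$1"    # the full output of: identify -verbose "$FILE"

    TIMESTAMP=$(echo "$INFO" | grep DateTimeDigitized | sed 's/.*DateTimeDigitized: //')
    CAMERA=$(echo "$INFO" | grep "exif:Model" | sed 's/.*exif:Model: //')
}

# Hypothetical usage: parseIdentify "$(identify -verbose "$FILE")"
```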

If you’re a BASH expert, please be kind.  I’m good at programming, not always good at scripting.  Otherwise, help yourself.

#!/bin/bash
BASE="/Volumes/e/Pictures/Processed"

moveFile()
{
#    local SOURCE="$1"
#    local BASE="$2"
#    local EXT="$3"
#    local SUFFIX="$4"
   
    if [ -f "$2.$4.$3" ] ; then
        moveFile "$1" "$2" "$3" $(($4 + 1 ))
    else
        mv -n "$1" "$2.$4.$3"
    fi
}

moveNonDuplicateFile()
{
    mkdir -p "$BASE/Sorted/$2/$3/$4/$5"

    local T="$BASE/Sorted/$2/$3/$4/$5/$3$4$5$6$7$8"

    moveFile "$1" "$T" "$9" "0"
}

#    moveDuplicateFile "$SOURCE" "$CAMERA" "$YEAR" "$MONTH" "$DAY" "$HOUR" "$MINUTE" "$SECOND" "$EXT"
moveDuplicateFile()
{
    mkdir -p "$BASE/Duplicates/$2/$3/$4/$5"

    local T="$BASE/Duplicates/$2/$3/$4/$5/$3$4$5$6$7$8"

    moveFile "$1" "$T" "$9" "0"
}

processMOV()
{
    TIMESTAMP=`mediainfo "$1" | grep "Encoded date" | head -n 1 | sed 's/Encoded date//' | awk '{$1=$1;print}' | sed 's/: //'`
   
    YEAR=`date -ujf "%Z %Y-%m-%d %H:%M:%S" "$TIMESTAMP" +%Y`
    MONTH=`date -ujf "%Z %Y-%m-%d %H:%M:%S" "$TIMESTAMP" +%m`
    DAY=`date -ujf "%Z %Y-%m-%d %H:%M:%S" "$TIMESTAMP" +%d`

    HOUR=`date -ujf "%Z %Y-%m-%d %H:%M:%S" "$TIMESTAMP" +%H`
    MINUTE=`date -ujf "%Z %Y-%m-%d %H:%M:%S" "$TIMESTAMP" +%M`
    SECOND=`date -ujf "%Z %Y-%m-%d %H:%M:%S" "$TIMESTAMP" +%S`
   
    if [ -z "$CAMERA" ] ; then
        CAMERA="MOV"
    fi
}

processRAW()
{
    TIMESTAMP=`dcraw -i -v "$1" | grep Timestamp | sed s/Timestamp\:\ //`
   
    YEAR=`date -jf "%a %b %d %H:%M:%S %Y" "$TIMESTAMP" +%Y`
    MONTH=`date -jf "%a %b %d %H:%M:%S %Y" "$TIMESTAMP" +%m`
    DAY=`date -jf "%a %b %d %H:%M:%S %Y" "$TIMESTAMP" +%d`
   
    HOUR=`date -jf "%a %b %d %H:%M:%S %Y" "$TIMESTAMP" +%H`
    MINUTE=`date -jf "%a %b %d %H:%M:%S %Y" "$TIMESTAMP" +%M`
    SECOND=`date -jf "%a %b %d %H:%M:%S %Y" "$TIMESTAMP" +%S`
   
    CAMERA=`dcraw -i -v "$1" | grep 'Camera:' | awk -F\: '{ print $2 }' | tr '[:lower:]' '[:upper:]' | awk '{$1=$1;print}'`
}

processAVI()
{
    TIMESTAMP=`mdls "$1" | grep kMDItemContentCreationDate | sed 's/kMDItemContentCreationDate     = //'`
   
    if [ "$TIMESTAMP" == "" ] ; then
        YEAR="0000"
        MONTH="00"
        DAY="00"
        HOUR="00"
        MINUTE="00"
        SECOND="00"
    else
        YEAR=`date -jf "%Y-%m-%d %H:%M:%S %z" "$TIMESTAMP" +%Y`
        MONTH=`date -jf "%Y-%m-%d %H:%M:%S %z" "$TIMESTAMP" +%m`
        DAY=`date -jf "%Y-%m-%d %H:%M:%S %z" "$TIMESTAMP" +%d`

        HOUR=`date -jf "%Y-%m-%d %H:%M:%S %z" "$TIMESTAMP" +%H`
        MINUTE=`date -jf "%Y-%m-%d %H:%M:%S %z" "$TIMESTAMP" +%M`
        SECOND=`date -jf "%Y-%m-%d %H:%M:%S %z" "$TIMESTAMP" +%S`
    fi
   
    if [ -z "$CAMERA" ] ; then
        CAMERA="AVI"
    fi
}

processJPG()
{
#    TIMESTAMP=`mdls "$1" | grep kMDItemContentCreationDate | sed 's/kMDItemContentCreationDate     = //'`
#    CAMERA=`mdls "$1" | grep kMDItemAcquisitionModel | sed 's/kMDItemAcquisitionModel        = \"//' | sed s/\"//`

    TIMESTAMP=`identify -verbose "$1" | grep DateTimeDigitized | sed 's/    exif:DateTimeDigitized: //'`
    TIMESTAMP="$TIMESTAMP -0000"
    CAMERA=`identify -verbose "$1" | grep "exif:Model" | sed 's/    exif:Model: //'`

    if [ "$TIMESTAMP" == " -0000" ] ; then
        TIMESTAMP=`identify -verbose "$1" | grep "date:modify" | sed 's/    date:modify: //' | sed 's/\(.*\)-\(.*\)-\(.*\)T\(.*\)\([+-]\)\(.*\):\(.*\)/\1:\2:\3 \4 \5\6\7/'`
        #TIMESTAMP=`mdls "$1" | grep kMDItemFSContentChangeDate | sed 's/kMDItemFSContentChangeDate = //'`
    fi
   
    # | awk '{$1=$1;print}'`
       
    #echo $TIMESTAMP / $IMAGE
   
    # 2014-07-05T11:12:16-04:00
   
    if [ "$TIMESTAMP" == "" ] ; then
        YEAR="0000"
        MONTH="00"
        DAY="00"
        HOUR="00"
        MINUTE="00"
        SECOND="00"
    else
#         YEAR=`date -jf "%Y-%m-%d %H:%M:%S %z" "$TIMESTAMP" +%Y`
#         MONTH=`date -jf "%Y-%m-%d %H:%M:%S %z" "$TIMESTAMP" +%m`
#         DAY=`date -jf "%Y-%m-%d %H:%M:%S %z" "$TIMESTAMP" +%d`
#    
#         HOUR=`date -jf "%Y-%m-%d %H:%M:%S %z" "$TIMESTAMP" +%H`
#         MINUTE=`date -jf "%Y-%m-%d %H:%M:%S %z" "$TIMESTAMP" +%M`
#         SECOND=`date -jf "%Y-%m-%d %H:%M:%S %z" "$TIMESTAMP" +%S`
        YEAR=`date -jf "%Y:%m:%d %H:%M:%S %z" "$TIMESTAMP" +%Y`
        MONTH=`date -jf "%Y:%m:%d %H:%M:%S %z" "$TIMESTAMP" +%m`
        DAY=`date -jf "%Y:%m:%d %H:%M:%S %z" "$TIMESTAMP" +%d`

        HOUR=`date -jf "%Y:%m:%d %H:%M:%S %z" "$TIMESTAMP" +%H`
        MINUTE=`date -jf "%Y:%m:%d %H:%M:%S %z" "$TIMESTAMP" +%M`
        SECOND=`date -jf "%Y:%m:%d %H:%M:%S %z" "$TIMESTAMP" +%S`
    fi
   
   
    if [ -z "$CAMERA" ] ; then
        CAMERA="Unidentified"
    fi
}

processMPEG4()
{
    TIMESTAMP=`mediainfo "$1" | grep "Encoded date" | head -n 1 | sed 's/Encoded date//' | awk '{$1=$1;print}' | sed 's/: //'`
   
    YEAR=`date -ujf "%Z %Y-%m-%d %H:%M:%S" "$TIMESTAMP" +%Y`
    MONTH=`date -ujf "%Z %Y-%m-%d %H:%M:%S" "$TIMESTAMP" +%m`
    DAY=`date -ujf "%Z %Y-%m-%d %H:%M:%S" "$TIMESTAMP" +%d`

    HOUR=`date -ujf "%Z %Y-%m-%d %H:%M:%S" "$TIMESTAMP" +%H`
    MINUTE=`date -ujf "%Z %Y-%m-%d %H:%M:%S" "$TIMESTAMP" +%M`
    SECOND=`date -ujf "%Z %Y-%m-%d %H:%M:%S" "$TIMESTAMP" +%S`

    if [ -z "$CAMERA" ] ; then
        CAMERA="MP4"
    fi
}

processFile()
{
    echo "Processing file $1"
   
    local EXT=`echo "$1" | sed 's/.*\.\([A-Za-z0-9]*\)/\1/' | tr '[:lower:]' '[:upper:]'`

    case $EXT in

        # Picture Formats Here

        NEF)
            processRAW "$1"
            ;;

        NRW)
            processRAW "$1"
            ;;

        JPG)
            processJPG "$1"
            ;;

        # Media Formats Here

        AVI)
            processAVI "$1" "$EXT"
            ;;

        MOV)
            processMOV "$1" "$EXT"
            ;;

        MP4)
            processMPEG4 "$1" "$EXT"
            ;;

        M4V)
            processMPEG4 "$1" "$EXT"
            ;;

        DB)
            rm "$1"
            return 0
            ;;

        PNG)
            # Park PNGs under Other/ instead of sorting them
            mkdir -p "$BASE/Other/PNG"
            mv -n "$1" "$BASE/Other/PNG/$2"
            return 0
            ;;

        PANO)
            # Park panoramas under Other/ instead of sorting them
            mkdir -p "$BASE/Other/PANO"
            mv -n "$1" "$BASE/Other/PANO/$2"
            return 0
            ;;

        DS_STORE)
            rm "$1"
            return 0
            ;;

        *)
            echo "Unmapped extension $EXT"
            return 0
            ;;

    esac

    # Skip the file if any timestamp component is missing
    if [ -z "$YEAR" ] ; then
        echo "Image with no YEAR"
        return 0
    fi

    if [ -z "$MONTH" ] ; then
        echo "Image with no MONTH"
        return 0
    fi

    if [ -z "$DAY" ] ; then
        echo "Image with no DAY"
        return 0
    fi

    if [ -z "$HOUR" ] ; then
        echo "Image with no HOUR"
        return 0
    fi

    if [ -z "$MINUTE" ] ; then
        echo "Image with no MINUTE"
        return 0
    fi

    if [ -z "$SECOND" ] ; then
        echo "Image with no SECOND"
        return 0
    fi

    TARGET="$BASE/Sorted/$CAMERA/$YEAR/$MONTH/$DAY/$YEAR$MONTH$DAY$HOUR$MINUTE$SECOND.0.$EXT"
   
    if [ -f "$TARGET" ] ; then
   
        SOURCEHASH=`md5 -r "$1" | awk '{ print $1; }'`
        TARGETHASH=`md5 -r "$TARGET" | awk '{ print $1; }'`

        if [ "$SOURCEHASH" == "$TARGETHASH" ] ; then
            moveDuplicateFile "$1" "$CAMERA" "$YEAR" "$MONTH" "$DAY" "$HOUR" "$MINUTE" "$SECOND" "$EXT"
        else
            moveNonDuplicateFile "$1" "$CAMERA" "$YEAR" "$MONTH" "$DAY" "$HOUR" "$MINUTE" "$SECOND" "$EXT"
        fi
    else
        moveNonDuplicateFile "$1" "$CAMERA" "$YEAR" "$MONTH" "$DAY" "$HOUR" "$MINUTE" "$SECOND" "$EXT"
    fi

#    mv -n "$1" "$TARGET"
}

processDirectory()
{
    echo "Processing dir  $1"
   
    cd "$1"

    YEAR=""
    MONTH=""
    DAY=""
    HOUR=""
    MINUTE=""
    SECOND=""
    CAMERA=""

    for FILE in * ; do

        # Skip the literal '*' the glob returns for an empty directory
        if [ ! -e "$1/$FILE" ] ; then
            continue
        fi

        if [ -d "$1/$FILE" ] ; then
            processDirectory "$1/$FILE"
        else
            processFile "$1/$FILE" "$FILE"
        fi

    done

    if [ -f ".DS_Store" ] ; then
        rm .DS_Store
    fi   

    cd ..
    rmdir "$1"
}

CURRENT=`pwd`

processDirectory "$CURRENT"

Trying to be a bit more social

So, I know this blog hasn't been active in a while.  Anyone who knows me will know that I'm not necessarily that big into social media.  I like the concept, but I tend to get a bit too busy to bother posting.  Oh, and don't even bother looking for me on Facebook.  I'm not there, at least not really.

However, given that I do have this blog and this little spot in the world, I thought I’d give it another go.  No promises for sure.

Monday, June 18, 2012

It's Been Quiet Around Here

Today marks day 60 that Felicia and Alex have been over in Romania.  When they first took off, I wasn't really sure what to expect.  And from what I can remember, neither was Feli.  I remember at the time thinking that 10 weeks was such a long time for them to be away.  Don't let anyone kid you, it is quite a long time.  But I think it's also been a good time.

I do know that Alex has had a grand time in Romania with his grandparents.  And I also know that they have had a good time having him around.  What's funny is watching him via Skype and in the videos Feli takes of him.  I think he's learned quite a bit of Romanian while he's been there.  Not speaking it, but definitely understanding it when it's spoken to him.  It'd be rather interesting to determine who knows more, me or him.  I think I'd win that one, but only by a small bit.

If you've been reading the blog or following the trip on Friendface, then I'm sure you've seen just how much he's grown over the past two months.  I know that when I was there, he'd grown a bit, but over the last 4 weeks he really seems to have grown.  And not just grown, but matured.  He no longer looks like a baby, but looks like a little boy.

I will say that this time apart would definitely have been more painful if it weren't for the trip I took over there for 8 days.  In retrospect, I should have taken a bit more time there (especially since it wouldn't have cost much more and could have happened in and around Memorial Day here in the States), but at least I was able to see them.  I think that 10 full weeks would have been almost unbearable.

At least while they were away I was able to get some work done on the house.  Right now, I'm sitting in Alex's room in the glider rocker typing this blog entry, surrounded by not only his stuff, but a decent percentage of the furniture from the living and dining rooms.  Why?  Well, that's because while they were gone I had both the upstairs and the main level's wood floors done.  The upstairs was a complete sand-down and refinish (and man, do they look different … you can see pictures on my Google+ account if you're interested).

The main level wasn't quite as invasive, though they were in much better shape too.  Instead of a complete sand down, these were just screened and poly's.  In fact, it was meant to be a single coat, but due to a technical glitch, our floor guy (if anyone needs a their floors done in DC, let us know, we can definitely recommend him) had to come back and redo the work.  So now the main level has two coats of fresh polyurethane on it and do they look shiny.

So, now as the last 10 days speed past I can look forward to having squirbles and @felioland back from abroad.  It's been a good time apart and now I'm ready to have them back home.  But if you happen to think of them on Wednesday night and/or Thursday morning, could you say a little prayer for them?  I'm not going to be able to help them as they travel back to the US and I'm sure the flight won't be the most fun thing they've ever done.  I'm sure Feli will have something to say about it here in a little over a week.

Cross posted from Alex's blog

My mobile phones

So, I saw an article today on The Verge asking what people's mobile phone timeline was.  Looking at some of the older phones people mentioned in the comments got me thinking about my phone history.  So, just for fun, here's mine:

Ericsson T28 (2000)

[Image: Ericsson T28]

So, my first phone that I bought myself was an Ericsson T28 world phone.  I still remember buying it at the Best Buy in Rockville just off I-270.  I went with it because it was a GSM-based world phone that I suspected would work in Europe.  I would have been right, except I didn't realize carriers SIM-locked phones.  So instead I used it when I was home and used something different over in the UK.  In fact, my first real phone was also an Ericsson, a T10, I think.  However, since that was a pre-paid phone loaned to me by eGrail, I'm not absolutely certain it counts.

Nokia 8890 (2001)

[Image: Nokia 8890]

I still think this phone was one of the coolest phones I ever owned.  I bought this little guy in the UK so I would have a phone of my own and something to use when I came home.  It came unlocked and worked on the 1900 MHz frequencies that T-Mobile (then VoiceStream) used in the US.  I carried that little guy all over Europe with me.  Unfortunately, it developed a crack along the speaker (just above the Nokia logo).  This caused the screen to basically stop working unless I squeezed the top really hard.  Too bad, as I really did like that little phone.  In fact, I still have it today, though I doubt it would even turn on if I tried.

Sony Ericsson T610 (2003)

[Image: Sony Ericsson T610]

My next phone was also a candy bar style phone.  At the time, flip phones were all the rage, like my older T28.  However, I wasn't a real fan of the flip phones.  For some reason, they never felt right to me.  The smaller, more svelte phones like the T610 always suited me better.  This one was no exception.  I really did like that phone and kept on using it until I decided I had to have a smart phone.  In fact, I really do believe I have this one still too.  It may even be unlocked.  I don't remember.

T-Mobile MDA (2005)

[Image: T-Mobile MDA]

After a spate of "feature" and candy bar phones, I upgraded to a true nerd phone, the T-Mobile MDA.  This was a Windows Mobile 5.0-based phone.  Sometimes I do miss this particular phone.  Granted, my new phones are much more full featured and definitely have a better collection of applications available for it, but this phone had one thing my newer phones don't: a keyboard.  And it was a decent keyboard too.  By the time I'd had it for a bit over 2.5 years, I'd become quite adept at typing on it.  Too bad the browser stunk and towards the end it would lock up for no real good reason.

iPhone 3G (2008)

[Image: Apple iPhone 3G]

So the story goes like this.  I've been a T-Mobile user since 2000 and really didn't want to switch.  I needed a new phone, quite badly as my phone (the aforementioned MDA) was really flaking out.  In fact, our last night in Rome saw the phone basically throw an electronic hissy-fit.  It decided that the alarm needed to go off and keep going off no matter how many times I rebooted.  That was basically the final straw on this one.

I assumed that I'd end up with the T-Mobile G1, the first Android-based phone on the market.  My plan was to head to the closest T-Mobile store and pick one up once I got back to the States.  However, a trip to duty free at the airport sort of derailed me.  They had iPhone 3Gs on sale, unlocked.  Granted, the price was €699, quite steep, but at least I could use it on my current contract and service.  With a little prodding from @felioland, I grabbed it and used it for a bit over 2 years.

It worked pretty well, though getting it to play nice with the T-Mobile network took a bit of work and some time tracking down the right settings.  And, of course, it never would work on 3G, as it wasn't compatible with the T-Mobile US frequencies.  But I did like the phone.

Samsung Nexus S

[Image: Samsung Nexus S]

And now to my current phone, the Samsung Nexus S.  Now, this phone I truly love.  The form factor is great, and the OS, Android 4.0 Ice Cream Sandwich (not pictured above), makes for a great phone experience.  Also, since it's a true T-Mobile phone, it offers decently fast speeds.  Fast enough to watch Netflix via Wi-Fi sharing, which almost consistently beats hotel Wi-Fi for performance.  It also came unlocked, something I didn't realize until I was trying to get T-Mobile to unlock it.  It took the rep chatting with her manager to realize it was never locked to begin with.

Saturday, July 09, 2011

Google Nexus S

So back in May, I ordered a new mobile phone.  As much as I liked my old iPhone 3G, it was getting a little long in the tooth.  It worked, but it was a bit on the goofy side when I tried to do certain things.  Plus, there was a bit of "screw you, AT&T, for trying to buy T-Mobile" in my choice.  I ended up with the Nexus S.  While I personally am new to the Android world, my other half has been in it for almost 2 years; she has the G1.  Now that I've had the phone for a couple of months, I wanted to document some of my experiences.

First and foremost, one of the best features of the phone is the built-in Wi-Fi hotspot.  Now, to be fair, the newer iPhones do have it built in as well, but since I'm on T-Mobile, the feature costs me nothing extra.  And it's legit.  No jailbreaking, no side-loading, nothing.  Just a built-in feature of the phone.  In fact, I'm at a coffee shop here in DC right now watching over my child process (also known as Alex) and I didn't even bother getting a code for the Wi-Fi here.  Instead, I've conducted some basic business over the portable hotspot on my Nexus S.

And while it’s handy, one thing I’ve found rather astonishing was the fact that during a recent trip to Philadelphia, my Nexus S was better for internet connectivity than the major name hotel I was staying at.  I was even able to watch Netflix streaming on my iPad over the personal hotspot while I couldn’t over the hotel provided Wi-Fi.  Now, there was also a wired connection and I wonder if that would have worked better, but since my iPad lacks an ethernet jack …

Other surprising things for me included:

  • Speed: even though the Nexus S is only on 3G and not T-Mobile's HSPA+/4G, using the phone itself is quite snappy.  And compared to the EDGE connectivity I was getting on my old iPhone, this guy's a downright speed demon.
  • Usability: if you listen to the Apple faithful, the Android phones are a pale imitation of the iPhone.  I guess in some ways they are correct.  The iOS devices are definitely slick and I do still enjoy my iPad (though it's become a bit flaky lately).  That being said, I do like my Nexus S.  It works and generally works well.  Once over the hump, I think it works as well for me as the iPhone did.  There are more rough edges, but it's not the night-and-day difference some think.
  • Auto Updating: it may be a simple thing, but knowing that my phone will just be constantly up to date is a great thing.  There’s no hooking it up to the computer to run updates, it just does it.  And even better is if you allow it, all your applications will automatically update themselves too.  Maybe that’s less of an issue for me, but for others (like my other half), having it keep itself up to date is a great thing.
  • Music: the music stack is not as nice as the iOS version, for sure.  However, I like the choices I have.  If I don't like the Google-provided apps, I can always install something else.  In fact, one of the best things I have is access to not only the Amazon MP3 music store and its cloud music service, but also the Google Music service.  Each gives me a single place to store my music and sync it to my phone, which saves me from having to decide exactly which music to download and keep.
  • Apps: now, any Android phone can share the same apps as my phone; however, there are a few exceptions.  One big surprise for me was that restricted applications, such as Netflix, target the Google phones first.  That meant that when Netflix shipped their streaming app for Android, it supported two phones: the Nexus One and the Nexus S.  I had sort of assumed that since the phone wasn't that popular, I would get those apps last.  I guess targeting the pure Google phones is a better idea than I thought.

There are definitely some things I miss and some of the applications aren’t as good on the Android as on iOS, but overall, I’m quite happy with my choice.

Wednesday, May 04, 2011

Next Redbook is out

Last fall I spent 3 weeks in Costa Mesa helping write the updated IBM FileNet P8 Platform and Architecture Redbook.  It's finally done and posted (or more specifically, it was posted about 3 weeks ago and I missed it).  Here's a link:

http://www.redbooks.ibm.com/Redbooks.nsf/RedbookAbstracts/sg247667.html?Open

Sunday, May 01, 2011

Switching mobile phones

I finally broke down and switched mobile phones this week. I had been using an iPhone 3G (bought new and unlocked at duty free in the Rome airport) on T-Mobile. While not perfect it was a quite serviceable little phone, even though I could only use EDGE speeds for mobile broadband, not the 3G my wife's Android-based G1 used.

For two years, the iPhone was exactly what I had wanted and was a joy to use. Two things happened to make me choose to switch (well, three if geek wanderlust is factored in).

First off, iOS 4.0 came out and many apps started to require it. Because of this, I ended up having to upgrade. Now, for 3GS and iPhone 4 users, that wasn't a big issue, but for us 3G users, it wasn't the best upgrade. While it did work, there were a few limitations; most importantly, the speed dropped. iOS 4 just needed a beefier CPU than what my older and aging 3G offered. Annoying, yes, but Apple did say from the start that a 3GS was the minimum for good performance.

While the speed thing was annoying, the second issue that started plaguing me was what appears to be an increasing reliance on mobile data. More and more of the applications I used frequently seemed to put a greater and greater strain on the data pipe. In practice, that meant more and more apps just seemed to stall for no good reason. On Wi-Fi, the problem was less noticeable, so I'm inclined to think the apps were assuming a good 3G connection.

The other reason I chose now to upgrade was the AT&T buyout of T-Mobile. I've been a customer of T-Mobile since 2000, back when they were VoiceStream. I've loved their service and their pricing, and since I live in a big city, coverage was never an issue for me. Well, there was no coverage in my parents' hometown in southern Illinois, but I'm not there too often, so it wasn't a real issue. Now, with AT&T buying them, I can only assume that means my bill will go up, and up, and up (hooray for no competition). Upgrading phones now means I signed up for a two-year contract extension, so when our clueless government approves the deal (which it will, even though it's bad for everyone but AT&T), I will still be under contract, and they have said they will honor all existing contracts.

Oh, and my current plan is unlimited everything (phone, data and SMS) for both Felicia and myself, for about the same price as everyone else's non-unlimited plans. Take that, Death Star...um...AT&T.

I've really only started playing with my new phone, a Samsung Nexus S. So far I really like it. Switching is a strange experience, one I will document more as time goes on.

- Posted using BlogPress from my iPad

Friday, April 15, 2011

New michaelandfelicia.com

For those of you who missed my last post, I decided to take the old michaelandfelicia.com, a long-disused home for our pictures, and make it more of a life stream, an aggregation of all our other social networking sites and blogs. What made this an interesting project was that I was able to pull my old FileNet WCM / eGrail virtual machine out of mothballs. It ended up being a decent vehicle for this kind of project. After a few weeks of messing around, the new site is now officially launched. It's still a work in progress in that it only tracks Twitter (and anything we cross post there) and our blogs. I hope to add more sites as I get a chance to build my RSS parser for other sites. But in the meantime, this should track us relatively well.

- Posted using BlogPress from my iPad

Monday, March 28, 2011

An old friend

In an ongoing effort to keep both friends and people who don't really care up to date with what I and the family have been up to lately, I decided to take my wife's and my website, michaelandfelicia.com, and turn it from a place where four-year-old content went to die into a one-stop shop for following us and our life. Frankly, I would have liked to have used FriendFeed with a custom URL for this, but since it's been bought by Facebook and seems to have been put out to pasture, I decided that a roll-your-own solution might be just the ticket. Though to be fair, roll-your-own is a bit of an overstatement.

So, to that end, I dug out and brushed off an old friend, FileNET WCM 5.1, or more specifically eGrail. It may have needed a bit of dusting and the removal of a small amount of bit rot, but once I got it back up and running, it felt and still feels like home. I do have to admit that it has taken me a bit of time to remember all the old tricks, but even after almost 10 years, the software seems as good as ever.

I often wonder what would have happened if FileNET had kept working on it ....

Anyway, I decided to take the old syndication module out for a spin and so far it seems to be working just great. I already have it pulling in feeds from Twitter and Blogger. Not sure which other feeds I'll pull in, but if it has RSS, then it's fair game as far as I'm concerned.

So far the only real issue I've seen is that WCM seems to expect a single article per XML file. That meant I had to preprocess the feeds using Java, but I had to do something to download the data anyway, so it does seem to be working out. As soon as it's ready, I'll post a link.
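For anyone curious, that preprocessing step is pretty simple in concept. Here's a minimal Java sketch of the idea (this is my own illustration, not the actual code from the site; the class name and sample feed are invented):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class FeedSplitter {

    /** Split a multi-item RSS feed into one Document per <item>,
        since WCM wants a single article per XML file. */
    public static List<Document> split(Document feed) throws Exception {
        DocumentBuilder builder =
                DocumentBuilderFactory.newInstance().newDocumentBuilder();
        NodeList items = feed.getElementsByTagName("item");
        List<Document> singles = new ArrayList<>();
        for (int i = 0; i < items.getLength(); i++) {
            // Copy each <item> into its own fresh document
            Document single = builder.newDocument();
            single.appendChild(single.importNode(items.item(i), true));
            singles.add(single);
        }
        return singles;
    }

    public static void main(String[] args) throws Exception {
        String rss = "<rss><channel>"
                + "<item><title>First post</title></item>"
                + "<item><title>Second post</title></item>"
                + "</channel></rss>";
        Document feed = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(
                        rss.getBytes(StandardCharsets.UTF_8)));
        System.out.println(split(feed).size()); // prints 2
    }
}
```

In the real version, each resulting document would be serialized back out to its own file for WCM to pick up.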

- Posted using BlogPress from my iPad

Saturday, March 12, 2011

Infrastructure

I hate it. I really hate it, but after seven years with Speakeasy as my primary Internet provider, I ended up having to switch to Comcast. There was just something increasingly wrong with my old POTS-based DSL line. When I first moved into the new house, I was getting a solid and very reliable 5.1 or 5.2 Mbps. While not the best, it was sufficient for my home office and enough to reliably stream Netflix in HD.

But it was definitely expensive. I was spending about $105 per month for the service, which was a bit high. However, it was business class and ran on a dedicated loop, meaning it wasn't sharing the line with my telephone service.

However, after a year and a half of service, something went wrong and the speed and reliability began to drop. Toward the end, about the best I could reliably achieve was 4.0-4.2 Mbps. I'd lost about 20% of my performance, and that drop was just enough that I could no longer stream Netflix in HD. In addition, with Felicia quitting her job to take care of Alex, we had additional demands on the service.

So, on the recommendation of my brother-in-law, I looked into Comcast Business service. I ended up with the cheapest package for Internet (10 Mbps down by 2 Mbps up) plus basic cable television and a telephone line. The true triple play, as it were. I did end up opting for the static IP address at $14/month, I think.

All of this together, even with the static IP, ended up being $50 per month less than what I was paying for my DSL line and telephone before. And just as important, it's been reliable so far. I've been with them for going on three weeks now and I can't really complain so far.

As time goes on, I'll try to report back if I see any throttling or bad traffic shaping.

Sunday, September 19, 2010

Mindy 3.0 – Open the Windows!

So, in what appears to be an annual event, I’ve had to rebuild Mindy yet again.  This time it wasn’t a complete hardware failure that caused the refresh, though there was a minor hardware problem involved.  Turns out when I rebuilt her last year, I had picked up three Western Digital WD1001FAYS 1 TB hard drives to use as part of the RAID configuration.  Unfortunately, those are desktop drives and appear to be basically incompatible with any kind of redundant RAID (RAID-1, RAID-5, etc.).  I kept seeing disk timeout errors from my RAID controller, which were causing system stalls (up to 60 seconds) and repeated verification runs on the array.  Besides being annoying (nothing like having the music you’re listening to just stop mid-stream, followed by a very annoying repeating click; love the SoundBridge, but it doesn’t handle that gracefully), I would also see file corruption on my primary file server.  Fortunately it was confined to the logs, but it was no good nonetheless.

The other issue I was seeing was an almost complete failure of the VMware web UI (MUI).  When I first rebuilt her, I installed Ubuntu Linux server 8.04.2 LTS.  My hope was to avoid doing a rebuild for a while and plus I’d worked with Ubuntu for a while and really liked it.  Unfortunately, it appears that Ubuntu + VMware Server 2.x (at least for me) are not really a good combination.  For the first couple of months, all was well, but then the MUI started failing to connect to the back end server.  At this point, I can’t tell if it was a simple degradation of the system or if it was a result of an update (I think the latter).  Regardless, by about 2 months ago, I would restart the UI, connect once or twice and that was it.  I could never reconnect, so no starting VMs, stopping VMs, etc.  It became basically unmanageable.

At that point, I decided it was time to fix the drive problems by replacing the desktop drives with three Western Digital WD2002FYPS RE4 2 TB drives.  These are the RAID-enabled drives, which basically means they have TLER (time-limited error recovery) enabled.

Time Limited Error Recovery (TLER) instructs the drive that, in the event of a read or write failure, it should limit the amount of time it spends attempting to correct the issue, typically to 7 seconds.  That way the RAID controller can catch the issue and help fix it, rather than assuming the drive has timed out.

Not only do they have TLER, they also have a 5-year warranty and a 64 MB cache, so they are blazingly fast.  More on that in a moment.

So, about $800 US later, I have three new hard disks and the battery backup unit (BBU) for my LSI / 3ware 9650SE RAID card.  (I’m thinking this thing is worse than a child, but I digress.)  I proceeded to back up all the data (no small feat with 3 TB online and only 1.5 TB of spare space around), and started the rebuild.  First thing I did was take out the six hard disks I had in the system before (3x1 TB and 3x500 GB), attach the BBU and install the three new RE4 drives.  Then I had to determine which OS to use for the new system.

My original plan was to move to VMware ESXi 4.1.0 so I could continue to leverage my investment in VMware (both as a desktop system and on the server).  So that was my first move.  I built the array and then tried installing ESXi.  No dice.  It would load the bootstrap and then, just before starting the configuration UI, it would freeze (at the “Booting: MBI=0x01100db, entry=0x00100212” step).  I couldn’t figure out what was happening.  Nothing online seemed to explain the error, and I had a hard time determining whether my motherboard (ASUS P6T Deluxe V2) was supported.  VMware supports only a small number of motherboards, and the ASUS was definitely not on the list.  I had found a site that said the P6T (no additional markings) had been made to work, so I didn’t assume it was a complete hardware incompatibility.

I was stumped, so I turned to Windows Server 2008 R2.  I have an MSDN account, so I pulled down the server installer, just to see if it worked.  Funny enough, it failed too.  The installer would start and then almost immediately give me a black screen and an error code (—insert code here—when I find it).  I dug around and was finally able to determine that Windows (and ESXi, for that matter) both failed to start because I had disabled ACPI in the BIOS (a remnant of my trying to solve the drive timeout problem mentioned earlier).  Enabling it allowed both to boot.

Now I had a decision to make: do I go with ESXi or Hyper-V?  My preference was definitely VMware’s solution, as I’ve used Server and Workstation for quite some time and have generally had good luck with them.  In fact, I use Workstation on my work laptop on a regular basis.  So it was back to trying to get VMware to boot.  First thing I discovered is that VMware ESXi (not ESX) cannot be installed to a RAID card like mine; it needs some sort of plain, non-RAID storage.  So I ordered a CompactFlash-to-SATA adapter.  Once that came in, I tried again.  Still no dice.  Now I got an error saying the vmfs3 module wouldn’t load.  Back to Google.

After a bit of searching, I determined that ESXi isn’t compatible with my network card, and apparently a supported NIC is a requirement for it to install.  I thought about ordering a new network card, but at this point, I was tired of spending money.  And while the CF-to-SATA card was being delivered, I had tried Windows 2008 and finally gotten it to a point where it looked usable.  So, I ended up installing Windows Server 2008 R2 and Hyper-V.  (Though, in retrospect, I probably should have gone with the standalone, free Hyper-V product … and if I had known about it, I would have.)

All wasn’t completely rosy, but I’d made my decision.  There were, however, a couple of key tweaks I had to perform before the system worked as it should.  I ended up having to disable the IPv4 large packet offload on all the systems before I managed to get proper network performance.  Actually, on my Server 2008 client OS install, I just disabled all offload.  Now, I can get around 40-60Mb/sec transfer rates to and from the VM.  That’s what I like to see and wasn’t ever quite seeing with the older VMware/Linux setup (though, I didn’t have the BBU for my card and I think the new drives have twice the cache, so that’s not necessarily an apples-to-apples comparison)

No matter; I now have Mindy 3.0 up and running and she’s doing quite well.  I’ve even used the Vmdk2Vhd converter to move VMs from VMware to Hyper-V.  One piece of advice for the transition, though: before moving any Windows VMs to Hyper-V, attach an IDE hard disk, boot the VM and uninstall the VMware Tools.  That’ll make sure you can (a) boot the Windows VM on Hyper-V (as it only offers IDE disks for conversions like this) and (b) actually get the VMware Tools uninstalled.  They don’t like being uninstalled from a non-VMware hosted system.

Monday, March 15, 2010

HD HomeRun

Was checking my newsfeeds on Sunday and happened to see that Newegg was selling a Silicon Dust HD HomeRun networked digital TV tuner for $75 (or $25 off the regular price).  I’ve been thinking of getting one of these for quite some time, so … what the heck, right?

For those of you not familiar, the HD HomeRun is just a simple TV tuner that connects to your home network.  Then, with the right software or hardware, it’ll let you watch live TV on your computer (or your TV …. wait …).  In addition, you can get software that’ll work like a DVR (or TiVo).  I’m hoping to use this to record live TV and then have it automatically converted to AppleTV format so I can watch it whenever I want to.

Ok, I could have gotten a TiVo and done the same thing, but with this solution, I get to keep the files and copy them to whatever device I want, like my AppleTV or an iPod.  Plus, I’m a nerd, so getting something like this to function and do what I really want it to do is a challenge and fun to boot.

Sunday, January 03, 2010

Apple’s Rumored Tablet … My Take

For at least the past 5 years (and maybe more, I’ve kinda lost track), there’s been speculation that Apple will release a tablet-based computing device.  Each year, the pundits and everyone else take their time to pontificate and/or pass on the latest inside information saying that the tablet is nigh.

This last year was no exception.  Everywhere I looked, people were thinking it was coming soon or would at least be announced at the next major Apple event.  Personally, I’ve not put much stock in the rumors because they didn’t seem credible, and from my way of thinking (based mostly on what other people were saying), there was no real reason for Apple to release one.  And up to this point, I’ve been right.  Apple has yet to release a tablet (other than the iPod Touch which, while very nice, isn’t a “tablet”).

However, the Wall Street Journal has started talking about a tablet computer, so the tablet appears to be a bit more real than it did six-to-eight months ago.  For that reason, and because a friend got me wondering too, I’ve started thinking about what a tablet could be.  Before I say anything else, this is pure conjecture and a fair amount of “wag”-ging (wild-ass guessing); I don’t have any kind of inside information or sources.  In fact, I’m pretty sure the world doesn’t really care what I say, but on the off chance what I say is correct, I at least wanted it on public record.

I tend to think that every Apple device they release has to have two things going for it:

  • A simple, elevator pitch to explain the device
  • A place in Apple’s overall story

Before I go forward, I do have to point out that I didn’t come up with these requirements.  Rather, I borrowed them from Chicago Sun-Times columnist and regular on TWiT’s MacBreak Weekly, Andy Ihnatko.

So, what do I mean by the above items?

Every Apple device they currently sell (especially on the consumer side) can be summed up in one sentence.  For example, the original iPod was “1000 songs in your pocket”.  The MacBook Air was the “lightest Macintosh ever”.  This is the elevator pitch.

Secondly, the device must fit into the Apple story.  Apple wants to be the center of your digital life.  The iPod offered a way to handle your music.  The iPhone offers the whole internet in your pocket as well.  Even the Macintosh fits in with iLife (iTunes, iMovie and iPhoto), handling your music, home movies and photos.

What would an Apple tablet bring to the table?

I tend to think it’ll be something of an extension of your digital life.  And I think it’ll do it through touch-ready versions of the iLife suite.

What I envision is a netbook-sized device that can carry all your pictures, movies and music with you wherever you go.  And it will work either as a stand-alone device or as an addition to your current Macintosh.  Going on a trip?  Take your music, movies and TV shows to watch.  Oh, and if you take pictures or video on the trip, download them directly via the built-in SDHC card reader (or via USB).  You can even do photo manipulation via iPhoto touch and quick video editing with iMovie touch.

Then when you get back home, you’ll be able to sync back up with your home Macintosh’s iPhoto and iMovie libraries to continue editing, or even move the videos and pictures into the professional apps, Aperture and Final Cut Pro.

I also suspect that in addition to Wi-Fi, it will integrate well with your iPhone.  Need to check e-mail on the run?  You can do it on your iPhone, or use the integration to check it on your tablet.  I don’t expect the tablet itself to offer built-in cellular wireless.

The one thing I have heard recently that makes me question this a bit is the talk of electronic magazine publishing via the tablet.  It makes some sense, but I’m still not inclined to see this as primarily a static-media device (like a magazine / newspaper reader).  Maybe that’s an addition, but I can’t imagine someone shelling out $600 or more for something like that alone.

I could be wrong, though.

Friday, June 26, 2009

Security Snake Oil: The Bogus Email Address

This is the first post in what I think may be a series on the security snake oil e-mails and ideas that get forwarded my way. However, before reading this, know that the best way to handle your computer security is outlined here.

Just received this forward on how to "protect" your e-mail address book from worms:

How to protect your e-mail address book:

A computer repairman says this is like having gold. This is a good thing. I learned a computer trick today that's really ingenious in its simplicity.

First dead giveaway that this isn't real is that the fix is "simple". There's no panacea for computer security.

As you may know, when/if a worm virus gets into your computer it heads straight for your email address book, and sends itself to everyone in there, thus infecting all your friends and associates.

This trick won't keep the virus from getting into your computer, but it will stop it from using your address book to spread further, and it will alert you to the fact that the worm has gotten into your system.

Here's what you do:

First, open your address book and click on 'new contact,' just as you would do if you were adding a new friend to your list of email addresses. In the window where you would type your friend's first name, type in ' A'.

For the screen name or email address, type AAAAAAA@AAA.AAA

Now, here's what you've done and why it works:

The 'name 'A' will be placed at the top of your address book as entry #1..

The first problem is that this may not be the case. Just because you see it first does not mean that it'll be the first e-mail address stored on the drive. Data is stored however the computer can access it quickest, or perhaps simply in the order you added it. When the computer goes to show you the details, it sorts that data into a human-readable order. The worm, however, will get the entries back in whatever order the computer can hand them over fastest.
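To make that distinction concrete, here's a toy Java sketch (purely illustrative; no real mail client works exactly like this) showing that the order a program sorts entries for display says nothing about the order they sit in storage:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class AddressBookOrder {
    public static void main(String[] args) {
        // Entries in the order they were added -- roughly how they sit on disk
        List<String> stored = new ArrayList<>(List.of(
                "bob@example.com",
                "carol@example.com",
                " A <AAAAAAA@AAA.AAA>"));

        // The mail client sorts a *copy* for display; storage order is untouched
        List<String> displayed = new ArrayList<>(stored);
        Collections.sort(displayed);

        // The bogus " A" entry sorts to the top of the display...
        System.out.println(displayed.get(0));
        // ...but the first entry a program reading storage sees is still Bob
        System.out.println(stored.get(0));
    }
}
```

So even granting the chain letter's premise, the worm reading the raw data would hit a real address first.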

This will be where the worm will start in an effort to send itself to all your friends.

When it tries to send itself to AAAAAAA@AAA.AAA, it will be undeliverable because of the phony email address you entered. If the first attempt fails (which it will because of the phony address), the worm goes no further and none of your friends will be infected.

The second place this breaks down is the assumption that the worm will stop on an error. It won't, for two reasons:

  1. Internet mail delivery doesn't work this way. The mail system takes the e-mail and then tells the app it's been received; the app moves on while the mail delivery system tries to deliver the message in the background.
  2. No worm writer would stop on a bad e-mail address. Even if it did get an immediate failure, it would just skip to the next address and keep going.
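That skip-and-continue behavior is trivial to write. This hypothetical Java loop (a harmless stand-in, obviously not real worm code; the names and the fake "delivery" check are mine) shows why one bad address blocks nothing:

```java
import java.util.List;

public class BulkSender {
    /** Returns how many addresses a naive bulk sender reaches.
        A failing address is skipped, not treated as a stop signal. */
    public static int sendToAll(List<String> addressBook) {
        int delivered = 0;
        for (String address : addressBook) {
            try {
                deliver(address);
                delivered++;
            } catch (IllegalArgumentException e) {
                // A bounce or invalid address just moves us to the next entry
            }
        }
        return delivered;
    }

    // Stand-in for real delivery: rejects the obviously fake address
    private static void deliver(String address) {
        if (address.endsWith(".AAA")) {
            throw new IllegalArgumentException("undeliverable: " + address);
        }
    }

    public static void main(String[] args) {
        List<String> book = List.of(
                "AAAAAAA@AAA.AAA", "bob@example.com", "carol@example.com");
        System.out.println(sendToAll(book)); // prints 2: the fake entry stops nothing
    }
}
```

Even the most carelessly written sender would behave this way, which is exactly why the "poison first entry" trick accomplishes nothing.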

Here's the second great advantage of this method: If an email cannot be delivered, you will be notified of this in your In Box almost immediately. Hence, if you ever get an email telling you that an email addressed to AAAAAAA@AAA..AAA could not be delivered, you know right away that you have the worm virus in your system. You can then take steps to get rid of it!

This is the only valid point in the whole article. Having a bad e-mail address in your address book would guarantee a bounce you might catch. However, it doesn't matter much, as by that point your computer already has a problem. And that problem could be stealing your identity, invading your privacy and generally causing trouble. The best bet is not to try to catch the problem after the fact, but to keep it from happening in the first place.