
Don’t use `npm install -g` inside Vagrant October 20, 2015

Posted by willhlaw in Development.

Using Vagrant for your node.js projects makes sense if you want an environment that can be reliably and repeatedly built by any of your team members, including new ones, without going through the hassle of writing Getting Started instructions for installing elasticsearch on Windows, Mac, and other *nix systems. I highly recommend it…or its up-and-coming successor, Otto (which actually uses Vagrant under the hood, but hides a lot of the complexity and applies best practices).

However, you cannot have the same start-up process each and every time if you install dependencies without pinning or locking in a specific version. So rather than running:

>npm install -g grunt-cli@0.1.13

Instead, install grunt-cli locally as a dev dependency; the `grunt` command will then be available to the npm scripts in your package.json’s scripts {} block.

>npm install grunt-cli@0.1.13 --save-dev
>npm shrinkwrap --dev

We capture the version npm grabbed and stored in package.json under the devDependencies {} block by running `npm shrinkwrap` with the --dev flag. This locks your entire (local) dependency tree from your package.json and creates an npm-shrinkwrap.json file. npm-shrinkwrap.json files are not used for global installs. So the next time a team member starts work on your project and runs `vagrant up` and then `npm install`, the locked dependencies will be read from npm-shrinkwrap.json (instead of package.json) and that team member’s node_modules/ directory will behave just like yours.
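
For illustration, an abridged npm-shrinkwrap.json for this project might look something like the sketch below. This is not output copied from a real run: the exact fields vary by npm version, and grunt-cli’s own nested dependencies are omitted for brevity.

{
  "name": "myProject",
  "version": "0.0.1",
  "dependencies": {
    "grunt-cli": {
      "version": "0.1.13",
      "from": "grunt-cli@0.1.13",
      "resolved": "https://registry.npmjs.org/grunt-cli/-/grunt-cli-0.1.13.tgz"
    }
  }
}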

npm-shrinkwrap.json files are not used for global installs — npm issues

So how then exactly do we run `grunt` from the command line since we don’t install it globally?

In package.json, we add some scripts:

{
  "name": "myProject",
  "version": "0.0.1",
  "devDependency": {
    "grunt-cli": "^0.1.13"
  },
  "scripts": {
    "grunt": "grunt",
    "test": "grunt test"
  }
}

And from the command line or another script, you run it with:

>npm run test  ## executes `grunt test`
>npm run grunt -- --help  ## executes `grunt --help`

The two extra dashes are npm’s way of allowing you to pass any grunt parameters through after them.

To conclude, use npm-shrinkwrap within Vagrant for a repeatable and reliable npm environment for your node.js project, install would-be global packages locally, and call them using `npm run <scripts.name>`.

Mobile App and Website Testing Roundup for 2015 April 5, 2015

Posted by willhlaw in Development, Mobile, Testing.

After researching testing for a startup, I thought I would share my findings.

Background on Testing

There are a few concerns that testing attempts to address that are applicable to most mobile app product deployments.

  • There is functional testing (is anything broken?).
  • There is load testing (does the app or website fold under pressure?).
  • And there is usability testing (do paid testers acting like users find the app or website easy to use?).
  • Then, there are the targets. You need to consider performing these tests against every combination of workflows, iOS and Android clients, the backend server and database, as well as any websites.
  • iOS and Android app testing is different, see why
  • List of Testing Tools for mobile and others

Appium (site, review) – Free, required for services like AppThwack

Free, open source, raw foundation for creating test scripts that can be written outside of the project’s code base. It has improved a lot over the past two years to become the tool of choice among QA shops. I chose this over other testing frameworks such as MonkeyTalk and Robotium.

  • Free
  • Requires coding knowledge and uses Selenium-style black box testing (a minimal script sketch follows this list)
  • It would be ideal if the development team used this framework or one of the aforementioned, because other testing services like AppThwack expect to run these test scripts.
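
To give a rough sense of what such a script looks like, here is a minimal Appium sketch using the wd JavaScript client. This is illustrative only: the capabilities, app path, and accessibility id are placeholders, not taken from a real project.

// Minimal Appium sketch using the wd client (illustrative placeholders throughout)
var wd = require('wd');

// Assumes an Appium server running locally on its default port
var driver = wd.promiseChainRemote('localhost', 4723);

driver
  .init({
    platformName: 'Android',          // or 'iOS'
    deviceName: 'Android Emulator',
    app: '/path/to/yourApp.apk'       // hypothetical app under test
  })
  // find a UI element by its accessibility id and tap it
  .elementByAccessibilityId('loginButton')
  .click()
  // always end the session, even if a step failed
  .fin(function () { return driver.quit(); })
  .done();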

AppThwack (site, review) – Cheap, requires test scripts like Appium

If you have test scripts running for your project, like Appium or others, then AppThwack can automate the running of those tests on 100s of real devices.

  • $20/month for 200 device test minutes up to $500/year for 7,500 device minutes
  • Requires suite of test scripts to be already written

TestElf (site, review) – Cheap

Covers functional testing and has a 2-day turn-around.

  • $50 signup offer, $200 for 1 test, $1,000 for 6 tests, $2,000 per month

UserTesting (site, review) – Variety

Shows videos of test users using the app or website.

  • $49/video or $3,000/year

Offers a free version called Peek, but the app needs to be in the app store.

  • Free

Applause (site, review) – Expensive

The following estimates (see and modify the actual quote) are for 4 apps (the iOS and Android apps count separately, and there are a Consumer app and a Merchant app):

  • 4 Apps Functional Testing costs $4,500 – $7,499*
    • *With annual subscription, and salesman said these autoquote numbers are high and can come down
    • Custom team of testers and allows up to 10 test case hours
  • 4 Apps Load Testing costs $18,000 – $30,000*
    • *Salesman said these autoquote numbers are high and can come down
    • Led by a performance engineer who hand-creates tests and will create reports and improvement recommendations

Recommendations

  • Try TestElf’s functional testing by paying $50 special offer
  • Try Peek’s usability testing for free
  • Depending on seriousness and budget, start to negotiate with Applause and/or require development team to create a suite of tests to cover each major requirement and workflow

Notes on Agile Product Ownership in a Nutshell March 9, 2015

Posted by willhlaw in Administrivia, Productivity.

Illustration

Above is an illustration from the video ‘Agile Product Ownership in a Nutshell’ that uses the RSA animation technique. Below are notes highlighting key points made during the video as well as some other points drawn from additional resources. This post will be a good read for Product Owners, both new and experienced, as well as any member of an Agile Scrum team who wants to revisit the basic principles and possibly realign their team.

Roles in Scrum
PO – Product Owner carries the vision, says no or yes to customer requests, prioritizes, and is responsible for building the right thing.
SM – Scrum Master is the coach, responsible for building it fast and for fast feedback cycles with the users.
Other Roles
TL – Technical Lead is responsible for building the thing right; talks closely with customers and other teams, but still encourages self-organization.
CPO – Chief Product Owner organizes multiple POs and interdependencies.
Development Manager hires and mentors engineers, creates culture, and knows when to step in and lead a discussion on branching strategy or versioning. [1]
Key Points
Product backlog becomes the Team backlog when working on multiple products (new products, old products, O&M, etc.)
Value = knowledge value + customer value.
Knowledge value is gained early to reduce risk. Knowledge user stories are UI mock-ups, trade studies, spikes, prototypes, etc.
Stories have an estimate for effort & value, so priority = value / effort. This should make it easier for the PO to prioritize the Team backlog.
Velocity goes down over time due to technical debt, architecture decisions, and getting behind on automated testing. It is the team’s job to correct this. However, how is this investment effort tracked? This question has been asked before, and a lot of the sources below were found there.
  1. Be transparent by explaining the benefits of paying down technical debt so that the PO can prioritize. [4]
  2. Have a separate, internal Improvement backlog that adds a tax to each sprint (e.g. 10-20% of velocity). [5] [6]
  3. Do not obsess over it; basically, just pick one and fix the broken stuff already [7]. Also, consider adding to the definition of done (DoD) that work is done only if it does not add any technical debt.
Charts [2]
Time vs customer value curve = knowledge value, customer value focus, trim the tail
Time vs delivered stories
– fixed scope
– fixed time
– fixed scope and time -> no, let’s decrease scope (b/c can always extend time and not the other way around)
Reasons for Scrum:
– team motivation (not being overworked, pressured from above, or lacking input or control over the march)
– deliver value in sweet spot of the triple constraints (time, cost, quality) Venn diagram
– more accurate predictions and expectation management
– standard metrics to evaluate team improvement and tech choices
Points measure effort [3]
Effort should translate to time, and is influenced by uncertainty and complexity.
  1. How much effort to get to that building? The answer for both the runner and the cripple is one.
  2. How much effort to get to a farther-away building? The answer for both is two, since it looks twice as far.
  3. How much effort to get to the farther-away building when there is a chasm of lava with only a small walkway? Both finally agree that it is a 4, since they will have to be extra careful and may drastically slow their progress.
  4. How much effort to get to the close building while singing Gangnam Style? Both agree that it is still a 1, as the extra complexity doesn’t really have an effect on the effort or cause a slowdown.
[1] Development managers vs scrum masters, Dan Radigan at Atlassian, https://www.atlassian.com/agile/effective-management-across-agile
[2] Agile Product Ownership in a Nutshell, Henrik Kniberg, Youtube video – https://www.youtube.com/watch?v=502ILHjX9EE, Transcription – http://blog.crisp.se/2012/10/25/henrikkniberg/agile-product-ownership-in-a-nutshell
[3] Story Points Are Still About Effort, Mike Cohn at Mountain Goat Software, http://www.mountaingoatsoftware.com/blog/story-points-are-still-about-effort
[4] How to translate “business value” of things that are technically important, Matthias Marschall at Agile Web Development & Operations, http://www.agileweboperations.com/how-to-translate-business-value-of-things-that-are-technically-important
[5] Effective Steps to reduce technical debt: An agile approach, Bastian Buch at Codovation, http://www.codovation.com/2012/06/effective-steps-to-reduce-technical-debt-an-agile-approach/
[6] Scrum Strategy – The Dev Team Improvement Backlog, Professional Scrum Trainer at Scrum Crazy, http://www.scrumcrazy.com/Scrum+Strategy+-+The+Dev+Team+Improvement+Backlog
[7] Do agile right – Delivery – Technical Debt, Atlassian, http://www.scrumcrazy.com/Scrum+Strategy+-+The+Dev+Team+Improvement+Backlog

Famo.us Easter Egg April 2, 2014

Posted by willhlaw in API, Javascript, Mobile.

In Steve Newcomb’s FAQ: Tough Questions on Famo.us, he includes an Easter Egg for those patient enough to read to the end. I thought those who wanted to google for a quick translation might find this helpful. Famo.us is a free and open source JavaScript development framework backed by a host of cloud services.

Si vos vere postulo impetro in BETA mox steve@famo.us ad minim veniam. Sciam si vestrae res et faciam te in BETA possim.

April 9 Si ad res, in prima acie, simul te BETA.

— Using Google Translate from Latin to English (link) –>

If you really need to get the Beta as soon as steve@famo.us more information, I come. I am able to rest assured that if your situation and I will make of thee in beta.

April 9 On the real thing, in the front line, at the same time you beet.

Best way to speed up Javascript (specifically jQuery) load times November 12, 2012

Posted by willhlaw in Javascript, jQuery, Web 2.0.


The obvious way is to use a popular CDN, short for Content Delivery Network, such as Google’s. Dave Ward explains the reasons very well in his article, “3 reasons why you should let Google host jQuery for you” and his three main points are:

  1. Decreased Latency – CDN allows the download to occur from the closest server to the user.
  2. Increased Parallelism – The local web server can be serving up content while the other connection is pulling from Google.
  3. Better Caching – Since so many other sites are using Google’s CDN, the user may already have the jQuery or javascript file in their cache.

However, there is a caveat. What if the connection to Google goes down? Can your site survive without jQuery? It should, if you developed the site with a responsive design and progressive enhancement.

The answer is to fall back to a local copy (code sample below).

  1. Write a script tag that refers to Google’s CDN.
  2. Then, in the next script fragment, check whether jQuery or some public variable from the javascript file exists.
  3. If the object does not exist, dynamically write another script tag to the page that refers to the local file.
<!-- Grab Google CDN's jQuery, with a protocol relative URL; fall back to local if necessary -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.5.1/jquery.js"></script>
<script>window.jQuery || document.write('<script src="js/libs/jquery-1.5.1.min.js">\x3C/script>')</script>
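
The same CDN-with-local-fallback pattern can be applied to other scripts as well. As an illustrative sketch (the jQuery UI version and the local path below are placeholders, not a recommendation), you would check for the plugin’s global before writing the local script tag:

<!-- Illustrative only: the same fallback pattern applied to jQuery UI -->
<script src="//ajax.googleapis.com/ajax/libs/jqueryui/1.8.16/jquery-ui.min.js"></script>
<script>(window.jQuery && window.jQuery.ui) || document.write('<script src="js/libs/jquery-ui-1.8.16.min.js">\x3C/script>')</script>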

I learned this concise technique while watching a Twitter Bootstrap 101 tutorial (kudos to David Cochran for the great videos) that referred to this best practice, which appears to have come from the HTML5 Boilerplate project.

By the way, the numbers (from Pingdom) support Google as being most likely the best CDN to use.

CDN performance numbers


A discussion on rating and finding dishes to improve your restaurant experience September 3, 2012

Posted by willhlaw in Administrivia, API, Food.

Photo from http://www.spain-recipes.com/spanish_tapas.html

Finding the best dish.

Last night, I was with some friends at a Greek tapas restaurant in Washington, DC, Cava Mezze. The menu has several small-plate items and everything looked delicious. We had eaten at another Cava restaurant at least once and had forgotten what we ordered and what was good. That is when I wished there was an app I could pull up on my phone to see what our friends thought of particular menu items. One did not exist, but we all agreed at the table how useful that would be. I had heard of startup companies starting to digitize restaurant menus, so I knew the dishes would soon be available, if not already, as APIs. We then ideated about a dozen ways we could build an app that would be useful for finding recommendations on specific menu items and dishes as well as for incentivizing users to rate the food they just ate.

Here is a summary of my morning market research into the possibilities of aggregating menu information or using an app with which you can quickly discover how good a particular dish is at a nearby restaurant.

I started at programmableweb and opened up, each in its own tab, every single “food” API that mentioned food data, dishes, or menus. I later looked at the popular Q&A site Quora.

all menus – Has a sophisticated restaurant and menu API that offers hundreds of thousands of restaurants, and you can search menus by city.

All Menus appears to be very developer friendly and even offers an interactive API documentation site. Documentation is also enhanced by Mashery and, interestingly, their support contact has a GrubHub address, perhaps because their search results have buttons for online ordering that point to GrubHub.

chownow – Digitizes restaurant menus to offer mobile and Facebook ordering.

Chow Now has an API that is private so you have to contact them for access and documentation.

food genius – A Chicago startup that started out as a “netflix for foodies” but ended up pivoting to deliver data to the restaurant industry. They get their data from other parts of the web, particularly GrubHub. Check out this video where they use their API to find the best curry in Chicago. They are involved with conferences and other food innovators at the food+technology blog site. This is where I learned that there is a lot of activity in this arena.

Food Genius encourages developers to use their API to create consumer apps and, to start, they have created a website, foodgenero.us, which gathers people’s tastes on dishes and also donates to Feeding America to end hunger.

Captured from http://foodgenero.us/

Food Genius is very developer friendly and they offer examples in python, node.js, and php. Nice work.

food spotting – This is the app that we were pretty much thinking of last night. It provides a social way to discover a dish that you want and a pretty easy way to review dishes. They have even gamified it through a concept called Guides that individuals can create, almost like playlists, and they even offer badges for others to win if they “spot” enough of the foods on a guide. I never realized finding, eating, and rating food could be loved so much.

Food Spotting’s API is language agnostic, RESTful, and seemingly developer friendly.

grub hub – GrubHub appears to be the premier website and app for ordering food online. GrubHub allows you to rate a restaurant with a star system but it also shows reviews from Yelp. The ordering for each menu item is detailed and they make it a point to tell you the delivery or pickup availability. GrubHub’s home page allows you to search a location and an optional keyword for restaurant name, food genre, or food type. The search results offer a list view and a map view.

Captured from home page http://www.grubhub.com

Grub Hub has little information on an API and, according to the founder’s response on Quora, their semi-API (I have no idea what that means) is private and you have to contact them.

locu – This is an up-and-coming startup that is focused on offering restaurants easy ways to publish their menus online. I am not sure where they are getting their menu information. More will most likely be revealed when they fully launch.

Locu has an API that is still in private or early access mode, and it appears to have been called menuplatform.com previously, because that domain now redirects to locu.com.

open dining – Offers a platform for restaurants to create a digital menu and ordering system for use on mobile and Facebook. This is a lot like Chow Now. Their API is developed and targeted toward ordering apps.

open menu – Openmenu.org and Openmenu.com want to become the industry standard. There is a simple search (with no map) for a dish around a certain location.

Open Menu has a format specification for restaurants and for their menus. Their APIs are fully developed, as that is the org’s main goal, and examples are provided only in PHP.

Captured from home page http://www.singleplatform.com/

single platform – This New York company looks like it has very strong backing and claims to be the world’s largest provider of menu items. They seem to be concentrating on signing up Washington, DC businesses because, on their homepage, they highlighted a local cafe that we love, Lost Dog Cafe. This link points to a w.singlepage.com URL but, interestingly enough, Lost Dog Cafe also has a GrubHub menu. I bet the Lost Dog owners were searching for an online solution and, when they checked out SinglePlatform, the singlepage was automatically created.

Single Platform has a private API that requires registration before you get any documentation, so essentially no information is available.

Conclusion

Start using Foodspotting and GrubHub now. Follow getfoodgenius and locu. Watch the industry as the open and platform menu players grow. Use the apps and APIs, see where the experience falls short, and consider developing an app that fulfills that need. For instance, consider an app with augmented reality that shows the highly rated items as you point your phone’s camera at a restaurant’s menu to help you decide what to order.

Lawnchair: Intro to Javascript library for persistent browser storage September 12, 2011

Posted by willhlaw in Javascript.

lawnchair

I was reading http://dailyjs.com and discovered this javascript library, Lawnchair: Simple JSON Storage, which is very lightweight (it is a lawnchair as opposed to a couch; 3.4k minified; 1.5k gzip’d) and uses a cascading set of techniques, or adapters, to store objects so that they persist in the browser even after page refreshes. Lawnchair is optimized for mobile and HTML5 browsers, but defaults to in-memory storage for older browsers.
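
For a quick sense of the API, here is a minimal sketch of the basic save/get pattern; it is untested on my part, and the store name and keys are just illustrative:

// Minimal Lawnchair sketch (store name and keys are made up for illustration)
var notes = new Lawnchair({name: 'notes'}, function (store) {
  // persist a simple object; 'key' is the lookup handle
  store.save({key: 'first', text: 'hello lawnchair'});

  // read it back; with a persistent adapter this survives a page refresh
  store.get('first', function (obj) {
    console.log(obj.text);
  });
});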

This post is an Intro because I just discovered it. Once I use it, I will post an article about my findings and title it “Lawnchair: Hands-On Summary…”. Hope this distinction helps.

Top 12 Ways to Be a Mensch (aka a good professional) September 11, 2011

Posted by willhlaw in Administrivia.

Enchantment by Guy Kawasaki

Paraphrased from http://www.openforum.com/idea-hub/topics/the-world/article/how-to-be-a-mensch-in-business-guy-kawasaki and taken from the author Guy Kawasaki’s book, Enchantment: The Art of Changing Hearts, Minds, and Actions.

Mensch is a German word for “human being”, but its Yiddish connotation far exceeds this definition. If you are a mensch, you are honest, fair, kind, and transparent, no matter whom you’re dealing with or whether anyone will ever know what you did. Bruna Martinuzzi, author of The Leader as a Mensch: Become the Kind of Person Others Want To Follow, compiled a list of ten ways to achieve menschdom. Here is Guy’s summary of her insights:

  1. Always act with honesty.
  2. Treat people who have wronged you with civility.
  3. Fulfill your unkept promises from the past.
  4. Help someone who can be of absolutely no use to you.
  5. Suspend blame when something goes wrong and ask, “What can we learn?”
  6. Hire people who are as smart as or smarter than you and give them opportunities for growth.
  7. Don’t interrupt people; don’t dismiss their concerns offhand; don’t rush to give advice; don’t change the subject. Allow people their moment.
  8. Do no harm in anything you undertake.
  9. Don’t be too quick to shoot down others’ ideas.
  10. Share your knowledge, expertise, and best practices with others.

Guy also adds two more ways to achieve menschdom:

  • Focus on goodwill. Mensches focus on goodwill – that is, positive actions that make the world a better place. People distrust those who focus on improving their own position and who denigrate others.
  • Give people the benefit of the doubt. People are good until proven bad, which is only proven after several bad experiences.

I have a looong way to go to become a Mensch, but I think these are excellent principles to live by and do business by.

FPWeb Hosting Badge July 8, 2010

Posted by willhlaw in Administrivia.

fpweb-powered-by_footer_2010

Tom Brauch and the nice guys over at FPWeb.net are hosting the sharejPoint.com website free of charge.  They do so because they feel that non-profit sites like sharejPoint that promote free, open source solutions are a great benefit for the SharePoint community. 

So the guys on the jPoint team have finally updated the master template to include the above banner, and I have also added the banner below to the jPoint blog page. 

fpweb-powered-by_2010_200x200

Which banner do you like better? 

Received Marklogic Innovation Award for SharePoint integration May 10, 2010

Posted by willhlaw in Marklogic, SharePoint, XML Database.

Marklogic User Conference 2010 Logo

Congratulations to my Dev Team on the Innovation Award!

At the Annual Marklogic User Conference 2010 in San Francisco, my team and I were awarded the Innovation Award. It came as a heavy crystalline pedestal and looks impressive. We were recognized for our SharePoint front-end integration with a Marklogic XML database backend.

In addition:

  • The team has built a model around a generic query.xqy module which acts like a mini app-builder widget. Pass it a query and a format type (html-basic, html-table, map, KML, RSS, JSON, etc.) and you will get results quickly, with a sidebar for keyword search and faceted search.
  • We have developed an improved SharePoint Connector for both SharePoint document libraries and SharePoint lists (the Marklogic SharePoint Connector only synchronizes document libraries). Our connector also allows the user to configure extra parameters, such as the collection to post the item to. If anyone is interested in this Marklogic SharePoint Connector for Lists and Libraries, please let me know. We are thinking about polishing it up and packaging it in a solution file.