Elance “Skills Assessment” Tests — HA!

Sometime in 2000 or 2001, I was asked to create a skills assessment for senior-level relational database management systems experts for a very young Brainbench. I was not alone: the company selected a total of four of us, each given the assignment of coming up with forty-five multiple-choice problems related to the general concepts of RDBMS. It took me about two days to complete the task, and a few weeks later, when the test went into public beta, folks could take the test and vote on the quality of the questions. The results of the votes were not shown to the public, and we (the consultants who created the questions) were not privy to the voting statistics either. Several weeks after that, the test became official, and I aced it: every one of the forty-five questions was from my packet.

Despite being rather proud of myself for a relatively small accomplishment, I was actually quite surprised; the quality of the submissions from the other consultants was – in my opinion – very, very good. In fact, I felt as though some of my own paled in comparison. It seemed that all of the other submissions came from folks who understood the topic thoroughly and were masters of their field. I did later learn that I was the only one of the four hired who did not have a computer science degree. Talk about humbling.

Today I decided to take a few skills assessment tests on Elance – a leading online freelance marketplace – on a variety of technical subjects. Included in the ones I took were tests to evaluate one’s comprehension of Linux and Amazon Web Services.

I was disgusted.

The grammar was horrible. The content was filled with fluff and trash. The questions weren’t representative of someone’s working knowledge of the subject – some even lifted text from “About Us” sections that described the company, not the service provided. And in the Linux test specifically, there were questions where multiple answers were correct but only one could be chosen, and others where no answer was technically correct but a choice had to be selected. Most appalling of all were questions on obscure, unnecessary trivia like “Which of the following software packages is used to create a closed-circuit television system?” I had to look that one up after the fact. I didn’t take a test on how to set up a video system; I took a test on Linux skills. I highly doubt the MCSE or MVP exams ask for the steps of motion tweening in Flash.

It was quite obvious that the tests were created by folks with limited knowledge of the subject matter. In fact, the work was probably completed – at least in large part – by the lowest bidder, who may very well have been a non-native-English-speaking administrative assistant. Hell, nowadays anyone with Internet access thinks they have the skills and marketability to work as a professional freelancer. Some do… most – and I really mean MOST – do not. These so-called “skills assessment” tests were proof positive of that; they’re a joke, and folks serious about testing the skills of others would be ashamed to have them represent their own knowledge of a given subject.

Granted, I can’t speak for all of the tests. There are many available, and on a wide variety of topics. I’m sure that some are much better than others, and that some of those may actually be very good at gauging an individual’s skill on the matter. Now they just need to try to get that same quality across the board.

Because if I can take a test on something of which I admittedly have almost zero knowledge, be more confused by the spelling and sentence structure of nearly every question and answer option than by the material itself, score a barely-passing 65%, and yet still land in the “Top 10%” of all test-takers, something must be wrong.

8 comments

  1. I’m curious… were these tests supplied by ExpertRating, or were they exclusive to Elance?

  2. Bill K. says:

    I just took their tests for Objective-C and iOS. I do NOT have a degree in computer science, but I did spend almost a whole year learning them to work on an app, working through half a dozen books. I failed both tests miserably.

    But most of the time I wondered who in God’s name would KNOW that answer, and whether it was even important.

  3. Garet C says:

    Recently I have been taking Elance skills tests. Sadly, I’ve done fairly well… except where it counts. In some skills where I’d honestly rank myself pretty well – though there are surely more specialized experts out there – I have reached the top 5%.

    Other areas where I have extensive knowledge, especially those involving their code testing, are just awful. Trying to run a function typed into a text box, with only “it worked” or “it didn’t work” – or MAYBE, for some languages, at least a parser error – as feedback is vague enough already.

    Big, big problems here. About half of the programs I wrote came back saying they produced the wrong output (which is some poorly defined format, usually involving commas). I went ahead and tested them on my own and found they worked as intended, minus a typo and a couple of missed semicolons. By and large, these programs worked just fine right out of the box in GCC, VC6, MinGW, or running under Apache. I’m mostly referring to C++ and PHP here.

    After all that, the code requested could be written by a sharp 12-year-old who’s read enough tutorials. Not even a single class declaration in the whole mess.

    Sigh…

  4. Jason H says:

    I recently took the WordPress test and failed miserably. I have set up several successful WordPress websites for myself and for customers, and thought I would do great. There were questions in there that only programmers and server admins would be able to answer.

  5. ANON says:

    Elance (as well as oDesk, I believe) uses so-called skill tests provided by a third-party company called ExpertRating. As some of the other commenters on this post have already noted, many of the tests are poorly designed and rife with errors and inconsistencies. Without a doubt, this makes them wholly inaccurate indicators of an individual’s level of expertise in a chosen subject area. The only option test takers are given is to “report” a faulty question when they encounter it while taking the test. Keep in mind that the tests are timed, and that the few moments you spend reporting a faulty question are subtracted from the total time you have to take the test (which is absurd).

    Furthermore, there seems to be no rhyme or reason to the way the percentage “rankings” are calculated. (oDesk displays a test taker’s results as a percentage, e.g. 95%, which I personally think is a more accurate indication to an interested client of how well you did, at least.) In my experience, I have scored incredibly high on some tests but have still been dropped to a much lower percentage ranking (e.g. top 20% of all contractors who took the same test, rather than top 5%), which makes little to no sense. The help section of the Elance website claims that each individual’s test results are compiled and compared against everyone else’s, and that the percentage rankings are therefore not fixed. But so far it has seemed to me that this isn’t entirely true.

    The website’s scoring and ranking systems have plenty of room for improvement. The tests are supposed to give potential clients a reliable idea of a contractor’s expertise in a chosen field. They really don’t.

  6. ANON says:

    Sorry, I made a few errors: “In my experience, I have scored incredibly high on some tests (missing only one question) but have still been dropped to a much lower percentage ranking (e.g. top 20% of all contractors who took the same test vs. the top 5%), which makes little to no sense when I see that other contractors have ranked much higher. (My issue is that I don’t get how this is even possible when, in the case of missing only one question, there’s nowhere else to go but a perfect score! I wonder how well others who took the same test did to get ranked in the top 5% or 1%, when it’s a matter of missing one question versus getting everything right.) The help section of the Elance website claims that each individual’s test results are compiled and compared against the results of everyone else who has ever taken the same test, and that the percentage rankings are subject to change as new scores are factored in. But from my observation, this doesn’t seem to be entirely true. The website’s scoring and ranking systems have plenty of room for improvement, and I have yet to receive a satisfactory response from Customer Support. The tests are supposed to give potential clients a reliable idea of a contractor’s expertise in a chosen field. But in reality, a lot of the tests (and the way Elance chooses to display the results) really don’t.”

    I hope this information helps others who are interested in joining the Elance network become better informed about some of the website’s practices.

  7. Nathan M says:

    I just took the PHP and C# tests and failed miserably.

    In the PHP test, they give you 15 minutes to solve a problem that takes 30 seconds to solve, and then they give you 5 minutes to solve a problem that takes 30 minutes to solve.

    In the C# test, I was blown away by the number of irrelevant questions about the CLR, the GAC, and nit-picking questions about satellite assemblies.

    I’ve worked in .NET for over a decade. Never, in my practical, useful, everyday dealings have I ever had to worry about the CLI and the CLR, or all the intimate details of Common Intermediate Language.

  8. mark zukerburg says:

    Get 100% results on elance skill tests with our best online services at http://www.kickexam.com and pass the exams with satisfaction. Get ranked at top 1% on every elance exam.
