coding comparison on specific lines within a source

    #2600
    kathleenoc

    Hello,

    I would like to do a coding comparison for coding that a colleague and I have done in one source document. The document was imported from Excel, so it has many rows. The issue is that the two of us have coded different rows within the document, with some overlap. We would like to run a coding comparison query ONLY on the rows that we have both coded. Is this possible?

    Thank you in advance for your help!

    #2944

    The bad news is that you can't split up a source. Nevertheless, you have several options. The first is visual: turn on coding stripes filtered by user, with sub-stripes displayed, so you can see not only which rows you both coded but also which nodes you coded them to.

    Another option is to run a coding comparison query limited to the source and the nodes you coded to. You can export the results to Excel, then delete the rows coded by one or the other but not both, leaving you with the view you want.
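    If the export is large, a short script can do that row deletion for you. Below is a minimal sketch in Python with pandas; the file name and the column names ("User A coded" and "User B coded") are assumptions, so substitute whatever your actual export uses.

    # Minimal sketch: keep only the rows of an exported coding comparison
    # that both users coded. File and column names are assumptions -- adjust
    # them to match your export.
    import pandas as pd

    results = pd.read_excel("coding_comparison_export.xlsx")

    # Keep rows where both users coded something (non-zero coverage for each).
    both = results[(results["User A coded"] > 0) & (results["User B coded"] > 0)]

    both.to_excel("coding_comparison_both_coded.xlsx", index=False)
    print(f"Kept {len(both)} of {len(results)} rows coded by both users.")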

    Finally, you could achieve your desired result with a matrix coding query, but this is a little trickier to do. I'm not sure how experienced you are with NVivo, so I'm assuming in this response that you know what a matrix is. If you don't, you may need to get back to me for more help.

    1. Start a matrix coding query.
    2. Insert the nodes you coded to into the columns.
    3. In the rows, change the default option from 'selected items' to 'selected users'.
    4. Don't forget to click 'add to list', as everyone gets caught out by that one.
    5. Choose Run. The matrix is created by intersecting the coders with the nodes, producing a column for each coder that shows the number of references coded at each node, and you can drill down to see the coded content (a sketch of rebuilding this table outside NVivo follows below).
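    If you want that same coder-by-node table outside NVivo, you can rebuild it from a flat export of the coding references. This is only a sketch under assumptions: it presumes one row per reference with columns named "Coder" and "Node", which may not match your export exactly.

    # Minimal sketch: rebuild the coder-by-node matrix from a flat list of
    # coding references. File and column names ("Coder", "Node") are
    # assumptions -- rename them to match your export.
    import pandas as pd

    refs = pd.read_excel("coding_references_export.xlsx")

    # One row per node, one column per coder, cells = reference counts.
    matrix = pd.crosstab(refs["Node"], refs["Coder"])
    print(matrix)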

    By using some or all of these three options, you should be able to assess your levels of rater agreement accurately and without too much trouble.

    Hope this helps,

    Kind regards,
