

Come Together: Document Collaboration, Part 1

One of the great features of the Internet, according to the pundits, is how wonderfully it enables communication and collaboration between widely separated individuals. Much of this communication occurs via email or, in some cases, via instant messaging, and works well. But I'll bet that if you've ever tried to collaborate with a group of people on the Internet to write and edit a document, you've found that moving the document back and forth across the Internet is merely a minor aspect of the entire process, and that effectively collaborating with people is quite difficult.

Here at TidBITS, we spend a lot of time together editing documents - even though none of us share a physical office - and we've worked out a system that has proven quite efficient for us. But since we all write for a variety of other publications and organizations, we've also come across numerous other collaboration approaches. It's continually astonishing to us just how hard it is to put together a good document collaboration strategy and how different groups have chosen to handle the problem.

This week I'll look at some of the major variables involved in any document collaboration system and give you a detailed look at exactly how we share documents within TidBITS. In the next installment, I'll explore a number of other systems I've used with other groups and talk about how well each worked. By the end, I hope you'll have a starting point from which to make your next document collaboration task fast and efficient.

Collaboration Design Variables -- There are several variables to consider whenever you're trying to set up a document collaboration process, especially one which doesn't rely on fancy tools that can be expensive, hard to deploy, and tricky to use. My assumption here is that you have a document you wish to have reviewed and edited by others; although documents may have multiple "owners" throughout a publication process, at any one time there should be only one person ultimately responsible for it.

  • Version control. Perhaps the most important consideration is how you'll handle version control, or making sure that everyone's comments and changes are integrated properly into the final document. There are two approaches here, scattershot and round-robin. In a scattershot approach, you send the document out to all the reviewers, then collect and combine those reviewers' comments. It's easy, requires little technology, and doesn't make demands on the reviewers. Unfortunately, it also often results in duplicate comments from reviewers (who also don't benefit from seeing each other's comments) and gives you a lot of extra work in evaluating and integrating all the different responses. That extra work can make for a better document in the end, though, since you maintain a coherent view of the whole. In the round-robin approach, each reviewer edits the document in turn, which lets them comment on one another's changes and eliminates some extra integration work. Unfortunately, you generally still have to do major cleanup on the final document. Worse, the round-robin approach takes time, and the amount of time goes up with the number of reviewers, since only one person can have the document at a time.

  • Number and type of reviewers. If you're passing a document to one other person and receiving it back, it shouldn't be hard to agree on an approach. However, as the number of reviewers increases, it's often best to choose one of three approaches: simple, rigid, or assisted. A very simple approach that requires little of the reviewers works best with ad-hoc reviewers, though it increases the work of the person in charge of the final document. A detailed approach with rigid rules about markup styles for comments and changes works better when you have a close-knit group that can agree on a process. Finally, if you simply can't get your reviewers to follow a process, assistance from document management tools may become necessary. I can't comment on these for the most part, since they tend to be large, complex, and expensive - three adjectives we at TidBITS avoid like the plague.

  • Document location. The resources available for document location and access - both for you and for your reviewers - play into how you set up your collaboration process. Centralized servers can work well, especially for a round-robin approach with a number of trained reviewers, but they're often hard to use and overkill for a couple of ad-hoc reviewers. In that case, a decentralized solution (usually using email, floppies, or CD-Rs to distribute files) usually proves more effective.

  • Document format. The document format you choose to work in can affect your collaboration process significantly, but it's important to remember that you can use different formats at different points in a publication process. For instance, even if an article is destined for QuarkXPress, that doesn't mean you can't design the review process to use Microsoft Word documents or even straight text files. Plus, programs like Word offer (occasionally inscrutable) collaboration tools of their own, whereas other formats may require you to decide on some simple markup to indicate additions, changes, deletions, and comments.

  • Document markup. No matter what format you choose (and different projects may call for different formats), it's a good idea to negotiate markup conventions ahead of time. Otherwise you'll waste a lot of time trying to figure out different markup approaches, particularly in situations where you'll be passing many documents by a number of reviewers. Even in Word - which has revision tracking and a comment feature that identifies the author of each change or comment - it's a good idea to agree on usage conventions ahead of time.
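To make the markup idea concrete, here's a minimal sketch of how a simple plain-text convention might be processed before publication. The specific markers ({+...+} for additions, {-...-} for deletions, and comment lines prefixed with three asterisks) are hypothetical illustrations, not the actual conventions of any particular publication:

```python
import re

def strip_review_markup(draft: str) -> str:
    """Produce publishable text from a marked-up plain-text draft.

    Hypothetical convention: "{+text+}" marks a proposed addition,
    "{-text-}" a proposed deletion, and lines starting with "*** "
    are reviewer comments signed with initials.
    """
    lines = []
    for line in draft.splitlines():
        if line.startswith("*** "):                    # drop comment lines entirely
            continue
        line = re.sub(r"\{\+(.*?)\+\}", r"\1", line)   # accept proposed additions
        line = re.sub(r"\{-.*?-\}", "", line)          # apply proposed deletions
        lines.append(line)
    return "\n".join(lines)

draft = """The quick {+brown +}fox jumps over the {-lazy -}dog.
*** Should this be a cat instead? -ace"""
print(strip_review_markup(draft))  # The quick brown fox jumps over the dog.
```

The point isn't the particular markers, but that a convention simple enough to be applied mechanically is also simple enough for every reviewer to remember.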

Keep in mind that no one collaboration strategy will fit every situation, and you may need to come up with several strategies for different groups.

TidBITS Collaboration -- We've adopted a specific strategy for document collaboration that works extremely well for us. It's not perfect, and it undoubtedly wouldn't work in all situations, but let me explain it so you can get a sense of how you might go about using parts of it in your own collaboration process.

We rely on a round-robin editing approach among a small number of editors working on a centralized server, accessible via both FTP and AppleShare over IP. All of our documents are in Nisus Writer, which provides styles for markup, although no revision tracking or comment features like Word's. Here's how a document might go through the entire process to end up in a TidBITS issue.

  1. One of us creates the draft document, applying the styles necessary for our issue creation and distribution macros. That person also does an initial edit pass. We don't worry about small changes, but we mark meaningful changes with colors so others can see what was done. Additions or modifications appear in green, and proposed deletions appear in red. If we need to make a comment or query about a paragraph, we make the comment in red on a line by itself below the paragraph to avoid confusing the meaning of the paragraph with intratextual comments. All comments are signed with the initials of the person making the comment. If the comment or query applies only to a small bit of text, we mark that text in blue. Comments and text for deletion are both red because all red text is automatically removed just before distribution.

  2. After finishing the initial edit pass, the document goes into a folder called IN on our server. Appended to its name are a version number and a set of initials. So this article is currently called Collaboration.1.ace, which indicates it has undergone one edit pass by me.

  3. Let's say Jeff Carlson wants to take the next edit pass. Anything in IN is fair game, so he checks the document out by moving it (a Finder drag via AppleShare over IP, but possible in Interarchy via FTP with the Rename/Move command) to another folder called OUT. He also adds his initials to the filename so anyone trying to figure out who owns the document can tell by the fact that it's now called Collaboration.1.ace.jlc. For Jeff and Geoff, checking out the document also means downloading it; since I'm on the same network as the server, I just open it directly, though I otherwise follow the same rules. While Jeff has the document out, he can make any changes he wants, using the same colored markup scheme and comment approach outlined in the first step.

  4. As Managing Editor, Jeff often has the task of sending an article back to the original author, if it wasn't written by one of the TidBITS staff members. Here's where our system falls down a bit. When copying text from Nisus Writer to any other program (such as Eudora), color information is lost (we also prefix all comments with three asterisks so they stand out even without colors). Interestingly, a similar color pasting problem afflicts Microsoft Word as well, so we can't export from Nisus Writer into Word as a workaround. And since relatively few people use Nisus Writer, sending the original file isn't generally a useful option. Thus, our preferred approach is to send the article back in the body of an email message and ask the author to make comments and offer suggested changes just as though they were replying to a normal email message. Jeff then incorporates the changes manually. That works well as long as the changes are relatively minimal, but for more significant rewrites we find that we just have to give the original file back to the author, let him or her make the necessary changes, and then restart the entire process. When Jeff's done, he uploads the file to IN again, and changes the name to Collaboration.2.ace.jlc so we know it's gone through the second edit pass and who did it.

  5. At any time during this process, we may send the article draft out to expert friends for quick technical review. They too get the article in the body of an email message and make comments and corrections in their reply. Whoever sent it out for review then has to incorporate the corrections and address the comments in the file, checking it out and back in again as necessary. At times, the list of initials at the end of the filename gets too long, at which point we delete some from the beginning of the list.

  6. When he's ready, Jeff moves the current version of the article into OUT, copies all the text, complete with colors and comments, and pastes it into what we call a "copy file," which is the working draft of the full issue. Once the copy file holds the latest versions of all the articles, it too goes into IN and follows the same process rules, although we only add initials to the filename when it has been moved into OUT, since that's happening on Monday when we need to know exactly who has the file open at all times. On Mondays we also tend to use email and the phone fairly heavily to let others know when the document moves to avoid wasting time in between manual checks of IN and OUT.

  7. Throughout all of this, colors and comments stay intact. At the last step before actual distribution, Geoff Duncan gives the issue the final read-through (often out loud - hopefully his neighbors don't mind) and deletes all the comments and text marked for deletion.
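The filename convention that drives the check-out process above can be sketched in a few lines of code. The parsing rules here are inferred from the examples in steps 2 through 4 (Collaboration.1.ace becomes Collaboration.1.ace.jlc on check-out, then Collaboration.2.ace.jlc at the next check-in), so treat this as an illustration of the idea rather than actual TidBITS tooling:

```python
# Filename protocol sketch: "Name.version.initials[.initials...]".
# Format inferred from the article's examples; an assumption, not real tooling.

def check_out(filename: str, initials: str) -> str:
    """Claim a document by appending the editor's initials."""
    return f"{filename}.{initials}"

def check_in(filename: str) -> str:
    """Bump the version number to record a completed edit pass."""
    parts = filename.split(".")
    parts[1] = str(int(parts[1]) + 1)  # parts[0] is the name, parts[1] the version
    return ".".join(parts)

name = check_out("Collaboration.1.ace", "jlc")
print(name)            # Collaboration.1.ace.jlc
print(check_in(name))  # Collaboration.2.ace.jlc
```

The same information the script extracts (who has the file, how many passes it has had) is exactly what the editors read off the filename by eye; the convention works because moving the file between IN and OUT doubles as the locking mechanism.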

For the most part, our system relies on simple technology - an AppleShare/FTP server and colored text in a word processor. We intentionally try to keep our markup rules simple so there's never any confusion internally about what's happened. And when we bring in reviewers via email who don't know our approach, either they don't need to know it, or we can explain it easily.

However, our system also relies on having a small number of technically savvy reviewers with excellent attention to detail. Our approach would fall apart if anyone was overly sloppy or failed to follow the rules, especially those relating to checking files in and out, since we could end up overwriting someone else's changes. But there are other approaches that work better in such situations, and in the next installment, I'll look at some that I've used with varying degrees of success.
