Merge pull request #198 from creachadair/indocumentado
Add documentation comments to package tokenizer.
bzz authored Jan 29, 2019
2 parents 260dcfe + 5245079 commit fe18dc0
Showing 1 changed file with 6 additions and 0 deletions.
internal/tokenizer/tokenize.go
@@ -1,3 +1,6 @@
+// Package tokenizer implements file tokenization used by the enry content
+// classifier. This package is an implementation detail of enry and should not
+// be imported by other packages.
 package tokenizer
 
 import (
@@ -8,6 +11,9 @@ import (
 
 const byteLimit = 100000
 
+// Tokenize returns language-agnostic lexical tokens from content. The tokens
+// returned should match what the Linguist library returns. At most the first
+// 100KB of content are tokenized.
 func Tokenize(content []byte) []string {
 	if len(content) > byteLimit {
 		content = content[:byteLimit]
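The new Tokenize doc comment describes behavior whose body the diff truncates: cap the input at the first 100KB, then emit Linguist-compatible tokens. As a rough, self-contained sketch of that truncate-then-tokenize shape only: the byteLimit constant and the slice truncation mirror the lines visible in the diff, while tokenizeSketch and its regexp split are placeholders invented for this example and do not reproduce enry's actual Linguist-compatible rules.

// truncate_then_tokenize_sketch.go
//
// Illustrative only. byteLimit and the truncation mirror the diff context;
// the tokenization itself is a crude stand-in, not enry's real Tokenize.
package main

import (
	"fmt"
	"regexp"
)

const byteLimit = 100000 // same cap as the constant shown in the diff context

// wordRe is a placeholder pattern for word-like tokens; the real tokenizer
// uses Linguist-compatible rules, not this regexp.
var wordRe = regexp.MustCompile(`[A-Za-z_][A-Za-z0-9_]*`)

// tokenizeSketch truncates content to the first 100KB, then returns crude
// word-like tokens. Hypothetical helper; not the package's Tokenize.
func tokenizeSketch(content []byte) []string {
	if len(content) > byteLimit {
		content = content[:byteLimit]
	}
	return wordRe.FindAllString(string(content), -1)
}

func main() {
	src := []byte("package main\n\nfunc main() { println(42) }\n")
	fmt.Println(tokenizeSketch(src)) // roughly: [package main func main println]
}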
