Commit Graph

16 Commits

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Matthew Honnibal | cf412adba8 | * Refactoring to use Tokens object | 2014-09-10 18:11:13 +02:00 |
| Matthew Honnibal | 45a22d6b2c | * Docs coming together | 2014-08-29 01:59:23 +02:00 |
| Matthew Honnibal | fdaf24604a | * Basic punct tests updated and passing | 2014-08-27 19:38:57 +02:00 |
| Matthew Honnibal | e9a62b6eba | * Refactoring with Lexeme as a class now compiles. Basic design seems to work | 2014-08-27 17:15:39 +02:00 |
| Matthew Honnibal | 68bae2fec6 | * More refactoring | 2014-08-25 16:42:22 +02:00 |
| Matthew Honnibal | 88095666dc | * Remove Lexeme struct, preparing to rename Word to Lexeme. | 2014-08-24 19:24:42 +02:00 |
| Matthew Honnibal | 782806df08 | * Moving to Word objects in place of the Lexeme struct. | 2014-08-22 17:28:23 +02:00 |
| Matthew Honnibal | 07ecf5d2f4 | * Fixed group_by, removed idea of general attr_of function. | 2014-08-22 00:02:37 +02:00 |
| Matthew Honnibal | 811b7a6b91 | * Struggling with arbitrary attr access... | 2014-08-21 23:49:14 +02:00 |
| Matthew Honnibal | a78ad4152d | * Broken version being refactored for docs | 2014-08-20 13:39:39 +02:00 |
| Matthew Honnibal | 01469b0888 | * Refactor spacy so that chunks return arrays of lexemes, so that there is properly one lexeme per word. | 2014-08-18 19:14:00 +02:00 |
| Matthew Honnibal | 057c21969b | * Refactor for string view features. Working on setting up flags and enums. | 2014-07-07 16:58:48 +02:00 |
| Matthew Honnibal | f1bcbd4c4e | * Reorganized code to accomodate Tokens class. Need string views before group_by and count_by can be done well. | 2014-07-07 12:47:21 +02:00 |
| Matthew Honnibal | ff1869ff07 | * Fixed major efficiency problem, from not quite grokking pass by reference in cython c++ | 2014-07-07 07:36:43 +02:00 |
| Matthew Honnibal | d5bef02c72 | * Reorganized, moving language-independent stuff to spacy. The functions in spacy ask for the dictionaries and split function on input, but the language-specific modules are curried versions that use the globals | 2014-07-07 04:21:06 +02:00 |
| Matthew Honnibal | 556f6a18ca | * Initial commit. Tests passing for punctuation handling. Need contractions, file transport, tokenize function, etc. | 2014-07-05 20:51:42 +02:00 |