package baguette_sharp


General lexer module: generates tokens either char by char or word by word (the default).

val read_token : string -> bool -> Token.token_type * int

Takes a word and a flag indicating whether the lexer is currently between quotes, and returns the corresponding token using Token#string_to_token.

val recognized_token : string list
val max_lst : 'a -> 'b list -> 'c

Returns the maximum element of a list.
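The library's implementation is not shown here; as an illustrative sketch only, a maximum over a list can be written as a fold (the name max_of_list is hypothetical, and this version assumes a non-empty list and OCaml's polymorphic compare):

```ocaml
(* Sketch: maximum of a list via a left fold; raises on the empty list. *)
let max_of_list = function
  | [] -> invalid_arg "max_of_list: empty list"
  | x :: rest -> List.fold_left max x rest

let () = assert (max_of_list [3; 1; 4; 1; 5] = 5)
```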

val is_a_token_a_keyword : string -> int

Takes a string and returns the longest token matching the keyword.
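Since the return type is int, the result is presumably the length (or index) of the best match. A hedged sketch of longest-prefix matching against a keyword list (the helper name and the prefix semantics are assumptions, not the library's actual logic; String.starts_with requires OCaml 4.13+):

```ocaml
(* Sketch: length of the longest keyword that [word] starts with, 0 if none. *)
let longest_keyword_match keywords word =
  List.fold_left
    (fun best kw ->
      if String.length kw > best && String.starts_with ~prefix:kw word
      then String.length kw
      else best)
    0 keywords
```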

val type_inference_algorithm : string -> Token.token_type

Runs the type inference algorithm
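The inference rules are not documented here; a minimal sketch of literal classification, assuming the lexer distinguishes integers, floats, and strings (the lit variant below is hypothetical and does not mirror Token.token_type's actual constructors):

```ocaml
(* Sketch: classify a word as an int, float, or string literal. *)
type lit = Int of int | Float of float | Str of string

let infer_literal s =
  match int_of_string_opt s with
  | Some i -> Int i
  | None ->
    (match float_of_string_opt s with
     | Some f -> Float f
     | None -> Str s)
```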

val extract_token : string -> int -> Token.token_type * Token.token_type

Takes a string containing a token and returns a pair of tokens.

val generate_token_with_chars : Stdlib.String.t -> Token.token_type list

The char-by-char lexer.

val generate_token : string -> Token.token_type list

The word-by-word lexer.
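As a rough sketch of the word-by-word approach (not the library's code), a pass of this shape splits the source on whitespace and classifies each word, where classify stands in for Token#string_to_token:

```ocaml
(* Sketch: split on spaces, drop empty fragments, classify each word. *)
let lex_words classify source =
  source
  |> String.split_on_char ' '
  |> List.filter (fun w -> w <> "")
  |> List.map classify
```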

val validate_parenthesis_and_quote : Token.token_type list -> Parser.parameters

A function that counts parentheses and checks that every parenthesis is closed and every quote is paired.
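A minimal sketch of this kind of balance check, assuming a simple counter scheme rather than the library's actual Parser.parameters result: parentheses must never close more than they open and must end at depth zero, and quote characters must occur an even number of times.

```ocaml
(* Sketch: validate balanced parentheses and paired quotes in one scan. *)
let balanced s =
  let depth = ref 0 and quotes = ref 0 and ok = ref true in
  String.iter
    (fun c ->
      match c with
      | '(' -> incr depth
      | ')' -> decr depth; if !depth < 0 then ok := false
      | '"' -> incr quotes
      | _ -> ())
    s;
  !ok && !depth = 0 && !quotes mod 2 = 0
```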