[REVIEW] Introduce Field and FieldMetadata
This commit introduces a new data structure for the implementation of
RFC005: records don't just hold values anymore, but optional values
together with metadata. Simple adaptations are done right away, while
more complex issues are postponed to future work.

MetaValue is kept in parallel for the time being.
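
For orientation, here is a minimal, self-contained sketch of the shape
these types take. The field names mirror what shows up later in this
diff (`value`, `metadata`, `doc`, `annotation`, `priority`), but the
exact definitions in `term::record` are richer; everything below is
illustrative only.

```rust
// Self-contained sketch only: the real Field/FieldMetadata in term::record
// hold actual terms and richer annotation data; names here are assumptions.

#[derive(Clone, Debug, Default)]
struct TypeAnnotation {
    types: Option<String>,  // optional static type annotation
    contracts: Vec<String>, // contract annotations
}

#[derive(Clone, Debug, Default)]
struct FieldMetadata {
    doc: Option<String>, // documentation attached to the field
    annotation: TypeAnnotation,
    opt: bool,             // is the field optional?
    priority: Option<i32>, // merge priority, if any
}

// A record field is an optional value *together with* its metadata: a field
// may carry metadata (doc, contracts, ...) without being defined yet.
#[derive(Clone, Debug, Default)]
struct Field {
    value: Option<String>, // stand-in for an actual term
    metadata: FieldMetadata,
}

fn main() {
    // A field that is documented but has no definition yet.
    let port = Field {
        value: None,
        metadata: FieldMetadata {
            doc: Some("TCP port the server listens on".into()),
            ..Default::default()
        },
    };
    println!("{port:?}");
}
```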

[REVIEW] Adapt parser infrastructure to Field

Adapt the parser module to use the new Field/FieldMetadata types from
RFC005. The grammar is also updated to generate them instead of the old
MetaValue.

[NO REVIEW] Fix errors after dyn fields change

We changed dynamic fields from RichTerm to Field, so that they can
carry their metadata. This commit fixes some compilation errors related
to that previous change.

[REVIEW] Handling of `Annotated` AST node

RFC005 will get rid of MetaValue altogether. It is split into metadata
attached to fields (FieldMetadata) and contract and type annotations,
which can still appear anywhere: the new Annotated node. This commit
updates various parts of the code to handle this new AST node.
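
As a rough, self-contained illustration of the split (all names below
are stand-ins for the real `Term`/`TypeAnnotation` definitions):

```rust
// Illustrative sketch of the split:
//
// Before RFC005, a single MetaValue wrapper carried documentation, optional
// status, priority, type and contract annotations, and optionally the value.
//
// After RFC005:
//   * FieldMetadata lives only on record fields (doc, opt, priority, ...),
//   * Annotated is a plain AST node for `term : Type` / `term | Contract`
//     annotations, which can appear on any expression.

#[derive(Clone, Debug, Default)]
struct TypeAnnotation {
    types: Option<String>,
    contracts: Vec<String>,
}

#[derive(Clone, Debug)]
enum Term {
    Num(f64),
    // The new node: an annotation wrapping an arbitrary sub-term.
    Annotated(TypeAnnotation, Box<Term>),
}

fn main() {
    // Roughly `5 | Num` as a toy AST value.
    let term = Term::Annotated(
        TypeAnnotation { contracts: vec!["Num".into()], ..Default::default() },
        Box::new(Term::Num(5.0)),
    );
    println!("{term:?}");
}
```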

[REVIEW] handle LetMetadata, fix destructuring

A let binding can still hold a `doc` item, hence the parser now has a
new intermediate annotation type (between FieldMetadata and Annotated):
LetMetadata. This commit deals with it, and fixes destructuring to work
with the new RFC005 types.

[NO REVIEW] Fix borrowing errors

[REVIEW] Introduce new RuntimeError term

Because of RFC005, missing field definition errors are known in
advance, and with better context, which is great. However, in some
situations they shouldn't necessarily be raised eagerly: for example,
if the field being processed is never used, we shouldn't raise the
error. This commit adds a new term to the AST, RuntimeError, which is
just a delayed error: if it is never evaluated, nothing happens, but if
the evaluator encounters it, the error it contains is simply raised.
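
The delayed-error idea can be sketched with a toy AST and evaluator;
this is purely illustrative, not the actual Nickel evaluator:

```rust
// Toy model of a delayed error: a RuntimeError node is inert until the
// evaluator actually reaches it. All names here are illustrative.

#[derive(Clone, Debug)]
enum Term {
    Num(f64),
    Plus(Box<Term>, Box<Term>),
    // Stores an error that is only reported if this node is ever evaluated,
    // e.g. a missing field definition detected ahead of time.
    RuntimeError(String),
}

fn eval(term: &Term) -> Result<f64, String> {
    match term {
        Term::Num(n) => Ok(*n),
        Term::Plus(a, b) => Ok(eval(a)? + eval(b)?),
        // Only when the evaluator encounters the node is the error raised.
        Term::RuntimeError(msg) => Err(msg.clone()),
    }
}

fn main() {
    let sum = Term::Plus(Box::new(Term::Num(1.0)), Box::new(Term::Num(2.0)));
    let delayed = Term::RuntimeError("missing definition for field `port`".into());

    assert_eq!(eval(&sum), Ok(3.0)); // never touches any error
    assert!(eval(&delayed).is_err()); // raised only because it was evaluated
}
```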

[NO REVIEW] Fix typo

[REVIEW] Update missing field def error for RFC005

[REVIEW] Propagate field through dynamic extension

Now that we can't carry metadata around inside terms, the dynamic
extension operator (or insertion operator) needs to perform some tricks
to recover the metadata of the original field, as well as to handle
fields without a definition.
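
The intent can be sketched on a toy, map-based record: inserting a
definition must keep whatever metadata is already attached to the
(possibly value-less) field instead of overwriting it. Names and types
below are assumptions made for illustration.

```rust
use std::collections::HashMap;

// Purely illustrative: a field is an optional value plus its metadata (here
// reduced to an optional doc string), and records map names to fields.
#[derive(Clone, Debug, Default)]
struct Field {
    value: Option<String>,
    doc: Option<String>,
}

// Dynamic extension: if the field is already declared (e.g. with
// documentation but no definition), its metadata must be recovered and kept,
// not silently replaced.
fn insert(record: &mut HashMap<String, Field>, name: &str, value: String) {
    let field = record.entry(name.to_string()).or_default();
    field.value = Some(value); // keep existing metadata, only set the value
}

fn main() {
    let mut record = HashMap::new();
    record.insert(
        "port".to_string(),
        Field { value: None, doc: Some("TCP port".to_string()) },
    );

    insert(&mut record, "port", "8080".to_string());
    assert_eq!(record["port"].doc.as_deref(), Some("TCP port"));
    assert_eq!(record["port"].value.as_deref(), Some("8080"));
}
```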

[NO REVIEW] Fill LetMetadata::combine

[NO REVIEW] Fill MissingFieldError::into_eval_err

[REVIEW] Typechecking Field and Annot

Refactor the typechecking code a bit and fill in missing
implementations to handle fields and annotated terms more uniformly.

[REVIEW] Implement merging for `Field`

Make merging act on Fields instead of MetaValue.

[NO REVIEW] Remove old TODOs, solve small ones

[REVIEW] Fix iter_without_opts

iter_without_opts didn't have the right semantics: it was just ignoring
all fields without a definition, optional or not.

[NO REVIEW] Fix pretty printing of dynamic fields

[REVIEW] Field traversal

Add a Traverse trait to give some structure to the big set of traverse
functions implemented on various types, and implement Traverse for
Field.
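
One possible shape for such a trait is sketched below; the signature
and the exact places where terms hide inside a `Field` are assumptions,
not the real definitions.

```rust
// Illustrative sketch of a Traverse-style trait: a single entry point to map
// a function over all sub-terms of a structure. Names/signatures are assumed.

#[derive(Clone, Debug)]
struct RichTerm(String); // stand-in for the real term type

trait Traverse: Sized {
    // Apply `f` to every term contained in `self`, rebuilding the structure.
    fn traverse<F, E>(self, f: &mut F) -> Result<Self, E>
    where
        F: FnMut(RichTerm) -> Result<RichTerm, E>;
}

#[derive(Clone, Debug)]
struct Field {
    value: Option<RichTerm>,
    contracts: Vec<RichTerm>, // terms can also hide inside the metadata
}

impl Traverse for Field {
    fn traverse<F, E>(self, f: &mut F) -> Result<Self, E>
    where
        F: FnMut(RichTerm) -> Result<RichTerm, E>,
    {
        let value = self.value.map(|t| f(t)).transpose()?;
        let contracts = self
            .contracts
            .into_iter()
            .map(|t| f(t))
            .collect::<Result<Vec<_>, E>>()?;
        Ok(Field { value, contracts })
    }
}

fn main() {
    let field = Field {
        value: Some(RichTerm("1 + 1".into())),
        contracts: vec![RichTerm("Num".into())],
    };
    let upper = field
        .traverse(&mut |RichTerm(s)| Ok::<_, ()>(RichTerm(s.to_uppercase())))
        .unwrap();
    println!("{upper:?}");
}
```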

[REVIEW] Update rec priorities for fields

Adapt the recursive priorities operator to RFC005. For now, we try to
maintain the old semantics. The price to pay is that we have to
deep-seq the terms to which the recursive priority is applied. This is
meant to evolve in later bikeshedding.

[NO REVIEW] Updating done TODOs

[NO REVIEW] Get rid of MetaValue type

[REVIEW] Adapt query to FieldMetadata

Since RFC005, querying just a value doesn't make sense anymore
(arbitrary values don't carry metadata). We have to query a field path
from a specific record. This commit reworks the querying infrastructure
to handle that: we recursively evaluate each element of the path to a
record, up to the last one, from which we extract the corresponding
field and access its metadata.
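
The path-walking part can be sketched as follows, with toy types and an
already-evaluated record; the real implementation evaluates each step
and handles errors and metadata more carefully.

```rust
use std::collections::HashMap;

// Toy model: records map identifiers to fields, and a field holds an optional
// value plus metadata (reduced here to an optional doc string).
#[derive(Clone, Debug)]
enum Value {
    Record(HashMap<String, Field>),
    Str(String),
}

#[derive(Clone, Debug)]
struct Field {
    value: Option<Value>,
    doc: Option<String>,
}

// Walk a field path (e.g. ["server", "port"]) inside nested records and
// return the final field, from which the metadata can be read.
fn query<'a>(mut current: &'a Value, path: &[&str]) -> Option<&'a Field> {
    let (last, prefix) = path.split_last()?;
    for name in prefix {
        match current {
            Value::Record(fields) => {
                current = fields.get(*name)?.value.as_ref()?;
            }
            _ => return None, // the path goes through a non-record
        }
    }
    match current {
        Value::Record(fields) => fields.get(*last),
        _ => None,
    }
}

fn main() {
    let port = Field { value: None, doc: Some("TCP port".into()) };
    let server = Field {
        value: Some(Value::Record(HashMap::from([("port".into(), port)]))),
        doc: None,
    };
    let top = Value::Record(HashMap::from([("server".into(), server)]));

    let field = query(&top, &["server", "port"]).unwrap();
    assert_eq!(field.doc.as_deref(), Some("TCP port"));
}
```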

[NO REVIEW] Get rid of eval_mode and depr. doc

[NO REVIEW] Get rid of compilation warnings

[REVIEW] Fix recursion limit codegen error

[REVIEW] Evaluation of annotated term

As contracts are still generated during the contract application phase,
the evaluation of an annotated term does nothing: it just unwraps the
underlying term.
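
In evaluator terms, the corresponding case is essentially an unwrap; a
toy sketch (not the actual evaluator code):

```rust
// Illustrative only: with contracts generated in an earlier transformation
// phase, evaluating an annotated term is just unwrapping. Toy types below.

#[derive(Clone, Debug)]
enum Term {
    Num(f64),
    Annotated(/* annotation elided */ Box<Term>),
}

fn eval(term: Term) -> f64 {
    match term {
        Term::Num(n) => n,
        // The annotation itself plays no role at this point: contracts were
        // already applied during the contract application phase.
        Term::Annotated(inner) => eval(*inner),
    }
}

fn main() {
    assert_eq!(eval(Term::Annotated(Box::new(Term::Num(2.0)))), 2.0);
}
```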

[NO REVIEW] Fix compilation errors in tests

[REVIEW] Add missing contract app for fields

Update the contract application transformation phase to also generate
contract applications for the new record fields (Field).

[NO REVIEW] Fix unbound identifier in merging

[NO REVIEW] Fix unbound identifier in rec prio op

[REVIEW] Disable tests after RFC005

One behavior is hard to replicate with RFC005, because of how metadata
are stored now. We disable the corresponding test for now.

[NO REVIEW] Fix bug in transforming dyn fields

[NO REVIEW] Fix forcing versions of rec prio op

[NO REVIEW] Remove empty lines from destruct tests

[REVIEW] First phase of fixing merging

Merging has become slightly more complex with RFC005. While everything
(field value, contracts, etc.) was previously bundled in a MetaValue
that could be closurized or reverted as one entity, there are now
several places where terms can live: both in the field's value and in
the field's metadata. Those must be closurized, reverted, saturated,
etc. individually.

[REVIEW] 2nd phase of fixing merging

Follow-up to the previous commit to fix merging in the context of
RFC005. In particular, the share_normal_form transformation is updated
to introduce revertible bindings for fields' metadata as well, as the
contract attached to one field may depend recursively on other fields
of the same record. Various missing closurize/revert/saturate
operations are added as well.

[REVIEW] Temporary work-around to fix a test

...while waiting for improvements to dictionary types (namely, that
they become more "lazy").

[REVIEW] Substitute wildcards in fields metadata

Add the missing substitution of wildcards inside the new FieldMetadata.

[REVIEW] Fix parsing of doc as multiline strings

[REVIEW] Fix pretty printing of standard strings

[NO REVIEW] Update snapshots for pretty printing

[REVIEW] Document RFC005 items, fix other doc

[NO REVIEW] Add missing cache argument to merge helpers

[REVIEW] Use a proper type for string rendering style

[FORMAT]

[REVIEW] Temporarily disable recursive priorities test

[NO REVIEW] Update snapshots test for pretty printing

[NO REVIEW] Fix missing import

[NO REVIEW] Fix clippy errors

[NO REVIEW] Post-rebase fixup

[NO REVIEW] Fix the LSP for RFC005

[REVIEW] Fix typechecking of annotated terms

Moving code around made the `walk_with_annot` function traverse terms
with a static annotation twice, the first time in non-strict mode. This
would change the behavior when other statically typed code was present
inside, as in non-strict mode all the variables introduced in between
would be assigned `Dyn` (mostly). This might also mess up the LSP, as
terms were added twice.

[NO REVIEW] Fix context completion for chained record.

Apply suggestions from code review

Co-authored-by: Gaga <[email protected]>

[DOC] add unwrap/unreachable justification

[DOC] Fix typo

[DOC] Add missing doc to helper function

[NO REVIEW] Replace item with meta in `IdentWithType`

We actually only need the metadata stored in the item to generate
completion data, so keeping only the metadata is more efficient and
easier to deal with.

[NO REVIEW] Display completion data using `FieldMetadata`

Update src/error.rs

Co-authored-by: Viktor Kleen <[email protected]>

Update src/transform/share_normal_form.rs

Co-authored-by: Viktor Kleen <[email protected]>

Update src/transform/share_normal_form.rs

Co-authored-by: Viktor Kleen <[email protected]>

Update src/eval/merge.rs

Co-authored-by: Viktor Kleen <[email protected]>

Update src/eval/merge.rs

Co-authored-by: Viktor Kleen <[email protected]>

Update src/eval/merge.rs

Co-authored-by: Viktor Kleen <[email protected]>

[TO SQUASH] Remove dead code/comment

[NO REVIEW] Post-rebase fixup
yannham committed Feb 8, 2023
1 parent 0db5f2f commit 02d92e7
Showing 64 changed files with 3,575 additions and 2,417 deletions.
4 changes: 2 additions & 2 deletions benches/nixpkgs/lists.ncl
@@ -392,7 +392,7 @@
let l = array.length xs in
array.generate (fun n => array.elem_at (l - n - 1) xs) l,

#TODO: is there a way to type staticaly?
#TODO: is there a way to type statically?
listDfs | forall a. Bool -> (a -> a -> Bool) -> Array a -> {visited: Array a, rest: Array a ; Dyn}
| doc m%"
Depth-First Search (DFS) for lists `list != []`.
@@ -470,7 +470,7 @@
else # there are no cycles
{ result = [ (dfsthis| {minimal:Dyn}).minimal ] @ (toporest | {result: Array Dyn}).result, },

#TODO: can it be staticaly typed?
#TODO: can it be statically typed?
sort | forall a. (a -> a -> Bool) -> Array a -> Array a
| doc m%"
Sort a list based on a comparator function which compares two
16 changes: 5 additions & 11 deletions lsp/nls/src/linearization/building.rs
@@ -5,7 +5,7 @@ use log::debug;
use nickel_lang::{
cache::Cache,
identifier::Ident,
term::{MetaValue, RichTerm, Term},
term::record::Field,
typecheck::{linearization::LinearizationState, UnifType},
types::TypeF,
};
@@ -102,11 +102,11 @@ impl<'b> Building<'b> {
pub(super) fn register_fields(
&mut self,
current_file: FileId,
record_fields: &HashMap<Ident, RichTerm>,
record_fields: &HashMap<Ident, Field>,
record: ItemId,
env: &mut Environment,
) {
for (ident, value) in record_fields.iter() {
for (ident, field) in record_fields.iter() {
let id = ItemId {
file_id: current_file,
index: self.id_gen().get_and_advance(),
@@ -123,13 +123,7 @@
usages: Vec::new(),
value: ValueState::Unknown,
},
meta: match value.term.as_ref() {
Term::MetaValue(meta @ MetaValue { .. }) => Some(MetaValue {
value: None,
..meta.clone()
}),
_ => None,
},
metadata: Some(field.metadata.clone()),
});
let key = *ident;
env.insert(key, id);
@@ -150,7 +144,7 @@
TermKind::Record(ref mut fields) => {
fields.insert(field_ident, reference_id);
}
_ => unreachable!(),
_ => panic!(),
}
}

20 changes: 9 additions & 11 deletions lsp/nls/src/linearization/completed.rs
@@ -1,7 +1,7 @@
use std::collections::HashMap;

use codespan::{ByteIndex, FileId};
use nickel_lang::{term::MetaValue, typecheck::linearization::LinearizationState};
use nickel_lang::{term::record::FieldMetadata, typecheck::linearization::LinearizationState};

use super::{
interface::{Resolved, TermKind, UsageState, ValueState},
@@ -157,22 +157,20 @@ impl Completed {
_ => item,
};

if let Some(
meta @ MetaValue {
ref doc,
ref types,
priority,
..
},
) = item.meta.as_ref()
if let Some(FieldMetadata {
doc,
annotation,
priority,
..
}) = item.metadata.as_ref()
{
if let Some(doc) = doc {
extra.push(doc.to_owned());
}
if let Some(types) = types {
if let Some(types) = &annotation.types {
extra.push(types.label.tag.to_string());
}
if let Some(contracts) = meta.contracts_to_string() {
if let Some(contracts) = annotation.contracts_to_string() {
extra.push(contracts);
}

72 changes: 37 additions & 35 deletions lsp/nls/src/linearization/mod.rs
@@ -6,7 +6,10 @@ use nickel_lang::{
cache::ImportResolver,
identifier::Ident,
position::TermPos,
term::{MetaValue, RichTerm, Term, UnaryOp},
term::{
record::{Field, FieldMetadata},
RichTerm, Term, UnaryOp,
},
typecheck::{
linearization::{Linearization, Linearizer},
reporting::{to_type, NameReg},
@@ -38,13 +41,12 @@ pub struct ItemId {
/// the linearization using the LSP [AnalysisHost]
#[derive(Debug, Clone, PartialEq)]
pub struct LinearizationItem<S: ResolutionState> {
//term_: Box<Term>,
pub env: Environment,
pub id: ItemId,
pub pos: TermPos,
pub ty: S,
pub kind: TermKind,
pub meta: Option<MetaValue>,
pub metadata: Option<FieldMetadata>,
}

/// [Linearizer] used by the LSP
@@ -57,7 +59,7 @@ pub struct AnalysisHost<'a> {
phantom: PhantomData<&'a usize>,
file: FileId,
env: Environment,
meta: Option<MetaValue>,
meta: Option<FieldMetadata>,
/// Indexing a record will store a reference to the record as
/// well as its fields.
/// [Self::Scope] will produce a host with a single **`pop`ed**
@@ -112,12 +114,12 @@ impl<'a> Linearizer for AnalysisHost<'a> {
// `record` is the id [LinearizatonItem] of the enclosing record
// `offset` is used to find the [LinearizationItem] representing the field
// Field items are inserted immediately after the record
if !matches!(
term,
Term::Op1(UnaryOp::StaticAccess(_), _) | Term::MetaValue(_)
) {
if !matches!(term, Term::Op1(UnaryOp::StaticAccess(_), _)) {
if let Some((record, (offset, _))) = self
.record_fields
// We call take because each record field will be linearized in a different scope.
// In particular, each record field gets its own copy of `record_fields`, so we can
// take it.
.take()
.map(|(record, mut fields)| (record, fields.pop().unwrap()))
{
@@ -188,7 +190,7 @@ impl<'a> Linearizer for AnalysisHost<'a> {
ty: ty.clone(),
pos,
kind: TermKind::Structure,
meta: self.meta.take(),
metadata: self.meta.take(),
});

ValueState::Known(ItemId {
@@ -219,11 +221,11 @@ impl<'a> Linearizer for AnalysisHost<'a> {
ty,
pos: ident.pos,
kind,
meta: None,
metadata: None,
});
}
for matched in destruct.to_owned().inner() {
let (ident, term) = matched.as_meta_field();
let (ident, field) = matched.as_binding();

let id = ItemId {
file_id: self.file,
@@ -241,13 +243,7 @@
Vec::new(),
ValueState::Known(id),
),
meta: match &*term.term {
Term::MetaValue(meta) => Some(MetaValue {
value: None,
..meta.clone()
}),
_ => None,
},
metadata: Some(field.metadata),
});
}
}
@@ -269,7 +265,7 @@ impl<'a> Linearizer for AnalysisHost<'a> {
ty: ty.clone(),
pos,
kind: TermKind::Structure,
meta: self.meta.take(),
metadata: self.meta.take(),
});

ValueState::Known(ItemId {
@@ -302,7 +298,7 @@ impl<'a> Linearizer for AnalysisHost<'a> {
ty,
pos: ident.pos,
kind,
meta: self.meta.take(),
metadata: self.meta.take(),
});
}
Term::Var(ident) => {
@@ -323,7 +319,7 @@ impl<'a> Linearizer for AnalysisHost<'a> {
pos: ident.pos,
ty: UnifType::Concrete(TypeF::Dyn),
kind: TermKind::Usage(UsageState::from(self.env.get(&key).copied())),
meta: self.meta.take(),
metadata: self.meta.take(),
});

if let Some(referenced) = self.env.get(&key) {
@@ -351,7 +347,7 @@ impl<'a> Linearizer for AnalysisHost<'a> {
},
child: accessor.to_owned(),
}),
meta: self.meta.take(),
metadata: self.meta.take(),
});
}
}
@@ -363,7 +359,7 @@ impl<'a> Linearizer for AnalysisHost<'a> {
pos,
ty,
kind: TermKind::Record(HashMap::new()),
meta: self.meta.take(),
metadata: self.meta.take(),
});

lin.register_fields(self.file, &record.fields, id, &mut self.env);
@@ -395,17 +391,15 @@ impl<'a> Linearizer for AnalysisHost<'a> {
let x = self.access.get_or_insert(Vec::with_capacity(1));
x.push(ident.to_owned())
}
Term::MetaValue(meta) => {
// Notice 1: No push to lin for the `MetaValue` itself
Term::Annotated(annot, _) => {
// Notice 1: No push to lin for the `FieldMetadata` itself
// Notice 2: we discard the encoded value as anything we
// would do with the value will be handled in the following
// call to [Self::add_term]
if meta.value.is_some() {
self.meta = Some(MetaValue {
value: None,
..meta.to_owned()
})
}
self.meta = Some(FieldMetadata {
annotation: annot.clone(),
..Default::default()
})
}
Term::ResolvedImport(file) => {
fn final_term_pos(term: &RichTerm) -> &TermPos {
@@ -444,7 +438,7 @@ impl<'a> Linearizer for AnalysisHost<'a> {
pos,
ty,
kind: TermKind::Usage(UsageState::Resolved(term_id)),
meta: self.meta.take(),
metadata: self.meta.take(),
})
}
other => {
@@ -456,12 +450,20 @@ impl<'a> Linearizer for AnalysisHost<'a> {
pos,
ty,
kind: TermKind::Structure,
meta: self.meta.take(),
metadata: self.meta.take(),
})
}
}
}

fn add_field_metadata(&mut self, _lin: &mut Linearization<Building>, field: &Field) {
// Notice 1: No push to lin for the `FieldMetadata` itself
// Notice 2: we discard the encoded value as anything we
// would do with the value will be handled in the following
// call to [Self::add_term]
self.meta = Some(field.metadata.clone())
}

/// [Self::add_term] produces a depth-first representation of the
/// traversed AST. This function indexes items by _source position_.
/// Elements are reordered to allow efficient lookup of elements by
@@ -535,14 +537,14 @@ impl<'a> Linearizer for AnalysisHost<'a> {
pos,
ty,
kind,
meta,
metadata: meta,
}| LinearizationItem {
ty: to_type(&table, &reported_names, &mut NameReg::new(), ty),
env,
id,
pos,
kind,
meta,
metadata: meta,
},
)
.map(|item| LinearizationItem {