Let's consider the following program:

```
let x0 = 1
let x1 = (x0, x0)
let x2 = (x1, x1)
...
let x[N] = (x[N-1], x[N-1])
in x[N]
```
The type of `x[N]` is a nested tuple with 2^N leaves. Note that it won't use exponential memory, because of sharing. Still, it's possible to make the type checker effectively hang by comparing that type for equality with itself. It may even be possible to break sharing and crash the type checker by exhausting memory.
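A minimal Python sketch of why self-comparison hangs (the names and representation are illustrative, not the actual checker): the doubling construction builds a DAG of only N+1 nodes, but structural equality that is unaware of sharing explores it as a tree with 2^N leaves.

```python
def build(n):
    """Build the doubled-tuple type n levels deep, with full sharing."""
    t = "int"
    for _ in range(n):
        t = (t, t)  # both components are the *same* object
    return t

def naive_eq(a, b, stats):
    """Structural equality with no sharing awareness: exponential in n."""
    stats[0] += 1
    if isinstance(a, tuple) and isinstance(b, tuple):
        return len(a) == len(b) and all(
            naive_eq(x, y, stats) for x, y in zip(a, b))
    return a == b

def sharing_eq(a, b, stats):
    """Same check, but short-circuiting on physical equality.
    (A real checker comparing two separately built but equal types
    would need hash-consing or memoization, not just this shortcut.)"""
    stats[0] += 1
    if a is b:
        return True
    if isinstance(a, tuple) and isinstance(b, tuple):
        return len(a) == len(b) and all(
            sharing_eq(x, y, stats) for x, y in zip(a, b))
    return a == b

t = build(15)
s1 = [0]; naive_eq(t, t, s1)    # 2**16 - 1 = 65535 calls
s2 = [0]; sharing_eq(t, t, s2)  # 1 call
```

With `n = 50` instead of 15, the naive version would need ~2^51 calls, which is the "effectively hang" scenario described above.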
It seems the only way out is to place a limit on the depth of nesting in types and on tuple size. Such limits are necessarily arbitrary, so it's not obvious how they should be chosen.
On second thought, a limit by depth doesn't work very well, since we can also have function types with rather deep nesting but very low fanout. Take for example the type `A → A → ... → A → A`, which is essentially represented as a list, while nested tuples can be very wide and generate large types with modest nesting. So the only viable way to limit the size of a type is to use the number of constructors as the metric.
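A sketch of the two problem shapes side by side, using an illustrative tagged-tuple representation (not the actual AST): the arrow chain is deep but has few constructors, while the doubled tuple is shallow but huge, so only the constructor count catches both.

```python
def arrow_chain(n):
    """A -> A -> ... -> A with n arrows: depth n, only 2n + 1 constructors."""
    t = "A"
    for _ in range(n):
        t = ("->", "A", t)
    return t

def tuple_double(n):
    """The doubling construction: depth n, 2**(n+1) - 1 constructors."""
    t = "int"
    for _ in range(n):
        t = ("pair", t, t)
    return t

def depth(t):
    return 1 + max(map(depth, t[1:])) if isinstance(t, tuple) else 0

def size(t):
    """Constructor count, counting shared nodes with multiplicity.
    Note this naive traversal itself takes exponential time on the
    shared DAG, which is why a real checker must cache sizes."""
    return 1 + sum(map(size, t[1:])) if isinstance(t, tuple) else 1

depth(arrow_chain(100)), size(arrow_chain(100))    # 100, 201
depth(tuple_double(10)), size(tuple_double(10))    # 10, 2047
```

A depth limit low enough to stop `tuple_double` would reject perfectly reasonable arrow chains; a constructor-count limit rejects `tuple_double` early while leaving the chain alone.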
It doesn't look like a good solution to me, since it will increase transaction size, and we'll have to check whether the annotated type is equal to the inferred one. Lambdas, though, have to be annotated. Effectively this outsources the type size check to the transaction size check.
On one hand, it likely will solve the problem of type blowup. It could still be possible to introduce growing types; while that seems unlikely, I can't rule it out. So far I'm slightly in favor of limiting the maximum size of a type. The size can be cached in the nodes of the type AST during its construction, and that approach is relatively foolproof.
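A sketch of the cached-size approach (the class, the exception, and the limit value are all made up for illustration): each node stores the size of the tree it unfolds to, so a parent's size is computed in O(1) from its children's cached sizes, and oversized types are rejected at construction time instead of hanging the checker later.

```python
MAX_TYPE_SIZE = 10_000  # arbitrary illustrative limit

class TooLarge(Exception):
    pass

class Ty:
    __slots__ = ("tag", "args", "size")
    def __init__(self, tag, *args):
        self.tag = tag
        self.args = args
        # Constructor count of the *unfolded* tree: shared children are
        # counted with multiplicity, but the sum itself is O(arity).
        self.size = 1 + sum(a.size for a in args)
        if self.size > MAX_TYPE_SIZE:
            raise TooLarge(f"type size {self.size} exceeds limit")

INT = Ty("int")
def pair(a, b):
    return Ty("pair", a, b)

# The doubling construction now fails after ~log2(MAX_TYPE_SIZE) steps:
t, steps = INT, 0
try:
    while True:
        t = pair(t, t)
        steps += 1
except TooLarge:
    pass
# steps == 12; last accepted type has size 2**13 - 1 = 8191
```

Because the check fires inside the constructor, no code path can ever hold a type above the limit, which is what makes the approach foolproof.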
P.S. On a related note, we may also have to beware of quadratic performance. It's not as catastrophic as exponential, but it still has potential to be used as a DoS vector.
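A small model of the quadratic pattern (pure arithmetic, assuming a hypothetical size cap): grow the type by a constant number of constructors per binding, e.g. `x[i+1] = (x[i], c)`. Each individual type stays modest, but the total number of constructors the checker touches across all n bindings is O(n^2).

```python
def total_work(n):
    """Sum of type sizes over n bindings x[i+1] = (x[i], c).

    Each step adds one pair constructor and one base constructor,
    so size(x[i]) = 2*i + 1 and the sum is n**2 + 2*n.
    """
    sizes, t = [], 1  # t: size of the current type; base type = 1 ctor
    for _ in range(n):
        t += 2
        sizes.append(t)
    return sum(sizes)

total_work(1000)  # 1_002_000 constructors visited for 1000 bindings
```

Doubling the program length roughly quadruples the work, so a per-type size cap alone doesn't bound the checker's total running time; a per-program budget would be needed for that.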