Merge branch 'master' into update-readme

Andrew Callahan 2019-12-05 15:44:46 -05:00 committed by GitHub
commit dc89580fe5
60 changed files with 3108 additions and 575 deletions


@ -1,12 +1,28 @@
# Used by "mix format"
locals_without_parens = [
get: 1,
index: 1,
post: 1,
read: 1,
read: 2,
create: 1,
create: 2,
update: 1,
update: 2,
destroy: 1,
destroy: 2,
actions: 1,
defaults: 1,
attribute: 2,
attribute: 3,
belongs_to: 2,
belongs_to: 3,
has_one: 2,
has_one: 3,
has_many: 2,
has_many: 3,
many_to_many: 2,
many_to_many: 3,
field: 2,
resources: 1,
max_page_size: 1,
default_page_size: 1
]
[

LICENSE Normal file

@ -0,0 +1,21 @@
MIT License
Copyright (c) [year] [fullname]
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.


@ -8,20 +8,42 @@ Ash builds upon the incredible power of Phoenix and empowers developers to get u
Ash is an open source project, and draws inspiration from similar ideas in other frameworks and concepts. The goal of Ash is to lower the barrier to adopting and using Elixir and Phoenix, and in doing so help these amazing communities attract new developers, projects, and companies.
## Quick Links
* For Resource DSL documentation, see: [Ash.Resource](Ash.Resource.html)
## Installation
If [available in Hex](https://hex.pm/docs/publish), the package can be installed
by adding `ash` to your list of dependencies in `mix.exs`:
```elixir
def deps do
[
{:ash, "~> 0.1.0"}
]
end
```
Documentation can be generated with [ExDoc](https://github.com/elixir-lang/ex_doc)
and published on [HexDocs](https://hexdocs.pm). Once published, the docs can
be found at [https://hexdocs.pm/ash](https://hexdocs.pm/ash).
## TODO LIST (in no particular order)
* Make our router capable of describing its routes in `mix phx.routes`. Chris McCord says we could probably power that, seeing as Phoenix controls both APIs, and that capability could be added to `Plug.Router`
* Finish the serializer
* Make primary key type configurable
* Make a DSL for join tables to support complex validation/hooks into how they work, support more than just table names in `join_through`
* DSL level validations! Things like includes validating that their chain exists. All DSL structs should be strictly validated when they are created.
* Especially at compile time, we should *never* ignore or skip invalid options. If an option is present and invalid, an error is raised.
* break up the `Ash` module
* Wire up/formalize the error handling
* Ensure that errors are properly propagated up from the data_layer behaviour, and every operation is allowed to fail
* figure out the ecto schema warning
* all actions need to be performed in a transaction
* document authorization thoroughly. *batch* (default) checks need to return a list of `ids` for which the check passed.
* So many parts of the system are reliant on things having an `id` key explicitly. This will need to be addressed some day, and will be a huge pain in the ass
* Validate that the user resource has a get action
* `params` should be solidified. Perhaps as a struct. Or perhaps just renamed to `action_params` where it is used.
* Since actions contain rules now, consider making it possible to list each action as its own `do` block, with an internal DSL for configuring the action. (overkill?)
* Validate rules at creation
* Maybe fix the crappy parts of optimal and bring it in for opts validation?
* The ecto internals that live on structs are going to cause problems w/ pluggability of backends, like the `%Ecto.Association.NotLoaded{}`. That backend may need to scrub the ecto specifics off of those structs.
* Add a mixin compatibility checker framework, to allow for mix_ins to declare what features they do/don't support.
* Have ecto types ask the data layer about the kinds of filtering they can do, and that kind of thing.
* Make `Ash.Type` a superset of things like `Ecto.Type`. If we bring in ecto database-less (looking more and more like a good idea to me), that kind of thing gets easier and we can potentially lean on ecto for type validations as well.
* use a process to hold constructed DSL state, and then coalesce it all at the end. This can clean things up, and also allow us to potentially eliminate the registry. This will probably go hand in hand w/ the "capabilities" layer, where the DSL confirms that your data layer is capable of performing everything that your DSL declares
* make ets dep optional
* Bake in descriptions to the DSL
* Contributor guideline and code of conduct
* Do branch analysis of each record after authorizing it, in authorizer
* consider moving `type` and `name` for resources out into json api (or perhaps just `name`) since only json api uses that
* When we support embedding, figure out `embed_as` on `Ash.Type`
* Consider allowing declaring a data layer at the *api* level, or overriding the resource's data layer at the *api* level
* Since actions can return multiple errors, we need a testing utility to unwrap/assert on them
* Flesh out relationship options
* Flesh out field options (sortable, filterable, other behavior?)


@ -1,4 +0,0 @@
use Mix.Config
config :ash,
ecto_repos: [Ash.Repo]


@ -1,25 +1,115 @@
defmodule Ash do
def resources() do
Application.get_env(:ash, :resources) || []
@moduledoc """
The primary interface for interrogating apis and resources.
This is not the code level interface for a resource. Instead, call functions
on an `Api` module that contains those resources.
"""
alias Ash.Resource.Relationships.{BelongsTo, HasOne, HasMany, ManyToMany}
alias Ash.Resource.Actions.{Create, Read, Update, Destroy}
@type record :: struct
@type cardinality_one_relationship() :: HasOne.t() | BelongsTo.t()
@type cardinality_many_relationship() :: HasMany.t() | ManyToMany.t()
@type relationship :: cardinality_one_relationship() | cardinality_many_relationship()
@type query :: struct
@type resource :: module
@type data_layer :: module
@type api :: module
@type error :: struct
@type filter :: map()
@type sort :: Keyword.t()
@type side_loads :: Keyword.t()
@type attribute :: Ash.Attributes.Attribute.t()
@type action :: Create.t() | Read.t() | Update.t() | Destroy.t()
@type side_load_config :: Keyword.t()
@spec resources(api) :: list(resource())
def resources(api) do
api.resources()
end
@spec primary_key(resource()) :: nil | attribute() | list(attribute)
def primary_key(resource) do
resource.primary_key()
end
@spec relationship(resource(), atom() | String.t()) :: relationship() | nil
def relationship(resource, relationship_name) when is_bitstring(relationship_name) do
Enum.find(resource.relationships(), &(to_string(&1.name) == relationship_name))
end
def relationship(resource, relationship_name) do
Enum.find(resource.relationships(), &(&1.name == relationship_name))
end
@spec relationships(resource()) :: list(relationship())
def relationships(resource) do
resource.relationships()
end
@spec side_load_config(api()) :: side_load_config()
def side_load_config(api) do
api.side_load_config()
end
@spec primary_action(resource(), atom()) :: action() | nil
def primary_action(resource, type) do
resource
|> actions()
|> Enum.filter(&(&1.type == type))
|> case do
[action] -> action
actions -> Enum.find(actions, & &1.primary?)
end
end
@spec action(resource(), atom(), atom()) :: action() | nil
def action(resource, name, type) do
Enum.find(resource.actions(), &(&1.name == name && &1.type == type))
end
@spec actions(resource()) :: list(action())
def actions(resource) do
resource.actions()
end
@spec attribute(resource(), String.t() | atom) :: attribute() | nil
def attribute(resource, name) when is_bitstring(name) do
Enum.find(resource.attributes, &(to_string(&1.name) == name))
end
def attribute(resource, name) do
Enum.find(resource.attributes, &(&1.name == name))
end
@spec attributes(resource()) :: list(attribute())
def attributes(resource) do
resource.attributes()
end
@spec name(resource()) :: String.t()
def name(resource) do
resource.name()
end
@spec type(resource()) :: String.t()
def type(resource) do
resource.type()
end
@spec max_page_size(api(), resource()) :: non_neg_integer() | nil
def max_page_size(api, resource) do
min(api.max_page_size(), resource.max_page_size())
end
@spec default_page_size(api(), resource()) :: non_neg_integer() | nil
def default_page_size(api, resource) do
min(api.default_page_size(), resource.default_page_size())
end
@spec data_layer(resource()) :: data_layer()
def data_layer(resource) do
resource.data_layer()
end
end
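The moduledoc above positions `Ash` as the introspection entry point rather than the per-resource interface. A minimal sketch of how these helpers might be called, assuming hypothetical `MyApp.Api` and `MyApp.Post` modules defined elsewhere:

```elixir
# Hypothetical modules; MyApp.Api would `use Ash.Api` and register the resources.
Ash.resources(MyApp.Api)
# => the list of resource modules registered on the Api

Ash.primary_action(MyApp.Post, :read)
# => the resource's primary read action, or nil

Ash.attribute(MyApp.Post, :title)
# => the attribute struct, or nil if there is no such attribute

Ash.relationship(MyApp.Post, "author")
# => lookups accept atom or string relationship names
```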

lib/ash/api/api.ex Normal file

@ -0,0 +1,105 @@
defmodule Ash.Api do
defmacro __using__(opts) do
quote bind_quoted: [opts: opts] do
@before_compile Ash.Api
@default_page_size nil
@max_page_size nil
@no_interface !!opts[:no_interface?]
@side_load_type :simple
@side_load_config []
Module.register_attribute(__MODULE__, :mix_ins, accumulate: true)
Module.register_attribute(__MODULE__, :resources, accumulate: true)
Module.register_attribute(__MODULE__, :named_resources, accumulate: true)
import Ash.Api,
only: [
default_page_size: 1,
max_page_size: 1,
resources: 1,
side_load: 2,
side_load: 1
]
end
end
defmacro resources(resources) do
quote do
Enum.map(unquote(resources), fn resource ->
case resource do
{name, resource} ->
@resources resource
@named_resources {name, resource}
resource ->
@resources resource
end
end)
end
end
defmacro side_load(type, config \\ []) do
quote bind_quoted: [type: type, config: config] do
unless type in [:parallel, :simple] do
raise "side_load type must be one if `:parallel` or `:simple`"
end
case type do
:simple ->
@side_load_type :simple
:parallel ->
@side_load_type :parallel
# TODO: validate no extra keys
unless config[:supervisor] do
raise "`:supervisor` option must be set."
end
@side_load_config [
supervisor: config[:supervisor],
max_concurrency: config[:max_concurrency],
timeout: config[:timeout],
shutdown: config[:shutdown]
]
end
end
end
defmacro default_page_size(value) do
quote do
@default_page_size unquote(value)
end
end
defmacro max_page_size(value) do
quote do
@max_page_size unquote(value)
end
end
defmacro __before_compile__(env) do
quote do
def default_page_size(), do: @default_page_size
def max_page_size(), do: @max_page_size
def mix_ins(), do: @mix_ins
def resources(), do: @resources
def side_load_config(), do: {@side_load_type, @side_load_config}
@resources
|> Enum.group_by(&Ash.type/1)
|> Enum.map(fn {type, resources} ->
if Enum.count(resources) > 1 do
raise "multiple resources w/ conflicting type #{type} in #{__MODULE__}"
end
end)
unless @no_interface do
use Ash.Api.Interface
end
Enum.map(@mix_ins || [], fn hook_module ->
code = hook_module.before_compile_hook(unquote(Macro.escape(env)))
Module.eval_quoted(__MODULE__, code)
end)
end
end
end
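A sketch of an Api module built with the macros above; the resource modules are hypothetical and assumed to be defined with the resource DSL, which is not part of this diff:

```elixir
defmodule MyApp.Api do
  use Ash.Api

  max_page_size 100
  default_page_size 25

  # Resources can also be registered as {name, resource} tuples.
  resources [MyApp.Post, MyApp.Comment]
end
```

Unless `no_interface?: true` is passed to `use Ash.Api`, the module also picks up the functions generated by `Ash.Api.Interface` below.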

lib/ash/api/interface.ex Normal file

@ -0,0 +1,146 @@
defmodule Ash.Api.Interface do
defmacro __using__(_) do
quote do
def get!(resource, id, params \\ %{}) do
Ash.Api.Interface.get!(__MODULE__, resource, id, params)
end
def get(resource, id, params \\ %{}) do
case Ash.Api.Interface.get(__MODULE__, resource, id, params) do
{:ok, instance} -> {:ok, instance}
{:error, error} -> {:error, List.wrap(error)}
end
end
def read!(resource, params \\ %{}) do
Ash.Api.Interface.read!(__MODULE__, resource, params)
end
def read(resource, params \\ %{}) do
case Ash.Api.Interface.read(__MODULE__, resource, params) do
{:ok, paginator} -> {:ok, paginator}
{:error, error} -> {:error, List.wrap(error)}
end
end
def create!(resource, params \\ %{}) do
Ash.Api.Interface.create!(__MODULE__, resource, params)
end
def create(resource, params \\ %{}) do
case Ash.Api.Interface.create(__MODULE__, resource, params) do
{:ok, instance} -> {:ok, instance}
{:error, error} -> {:error, List.wrap(error)}
end
end
end
end
def get!(api, resource, id, params \\ %{}) do
api
|> get(resource, id, params)
|> unwrap_or_raise!()
end
def get(api, resource, id, params \\ %{}) do
# TODO: Figure out this interface
params_with_filter =
params
|> Map.put_new(:filter, %{})
|> Map.update!(:filter, &Map.put(&1, :id, id))
|> Map.put(:page, %{limit: 2})
case read(api, resource, params_with_filter) do
{:ok, %{results: [single_result]}} ->
{:ok, single_result}
{:ok, %{results: []}} ->
{:ok, nil}
{:error, error} ->
{:error, error}
{:ok, %{results: results}} when is_list(results) ->
{:error, :too_many_results}
end
end
def read!(api, resource, params \\ %{}) do
api
|> read(resource, params)
|> unwrap_or_raise!()
end
def read(api, resource, params \\ %{}) do
params = add_default_page_size(api, params)
case Map.get(params, :action) || Ash.primary_action(resource, :read) do
nil ->
{:error, "no action provided, and no primary action found"}
action ->
Ash.DataLayer.Actions.run_read_action(resource, action, api, params)
end
end
def create!(api, resource, params) do
api
|> create(resource, params)
|> unwrap_or_raise!()
end
def create(api, resource, params) do
case Map.get(params, :action) || Ash.primary_action(resource, :create) do
nil ->
{:error, "no action provided, and no primary action found"}
action ->
Ash.DataLayer.Actions.run_create_action(resource, action, api, params)
end
end
defp unwrap_or_raise!({:ok, result}), do: result
defp unwrap_or_raise!({:error, error}) when is_bitstring(error) do
raise Ash.Error.FrameworkError.exception(message: error)
end
defp unwrap_or_raise!({:error, error}) when not is_list(error) do
raise error
end
defp unwrap_or_raise!({:error, error}) do
combo_message =
error
|> List.wrap()
|> Stream.map(fn error ->
if is_bitstring(error) do
Ash.Error.FrameworkError.exception(message: error)
else
error
end
end)
|> Enum.map_join("\n", &Exception.message/1)
raise Ash.Error.FrameworkError, message: combo_message
end
defp add_default_page_size(_api, %{page: %{limit: value}} = params) when is_integer(value) do
params
end
defp add_default_page_size(api, params) do
case api.default_page_size() do
nil ->
params
page_size ->
Map.update(
params,
:page,
%{limit: api.default_page_size},
&Map.put(&1, :limit, page_size)
)
end
end
end
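Given the hypothetical `MyApp.Api` above, the generated interface might be used like this (a sketch, not a stable contract):

```elixir
# Read with a filter and page; returns {:ok, paginator} or {:error, errors}.
{:ok, paginator} =
  MyApp.Api.read(MyApp.Post, %{filter: %{title: "foo"}, page: %{limit: 10}})

# Get by id; returns {:ok, record}, {:ok, nil}, or {:error, errors}.
{:ok, post} = MyApp.Api.get(MyApp.Post, "11111111-1111-1111-1111-111111111111")

# Bang variants unwrap {:ok, result} or raise.
post = MyApp.Api.create!(MyApp.Post, %{attributes: %{title: "bar"}})
```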


@ -1,18 +0,0 @@
defmodule Ash.Application do
# See https://hexdocs.pm/elixir/Application.html
# for more information on OTP Applications
@moduledoc false
use Application
def start(_type, _args) do
children = [
{Ash.Repo, []}
]
# See https://hexdocs.pm/elixir/Supervisor.html
# for other strategies and supported options
opts = [strategy: :one_for_one, name: Ash.Supervisor]
Supervisor.start_link(children, opts)
end
end


@ -0,0 +1,130 @@
defmodule Ash.Authorization.Authorizer do
alias Ash.Authorization.Rule
@type result :: :allow | :unauthorized | :undecided
def authorize_precheck(user, rules, context) do
rules
|> Enum.reduce({%{}, []}, fn rule, {instructions, per_check_data} ->
{instructions, check_data} =
rule
|> precheck_result(user, context)
|> List.wrap()
|> Enum.reduce({instructions, %{}}, &handle_precheck_result/2)
{instructions, [check_data | per_check_data]}
end)
|> predict_result(rules)
end
# Never call authorize w/o first calling authorize_precheck before
# the operation
def authorize(user, data, rules, context, per_check_data) do
{_decision, remaining_records} =
rules
|> Enum.zip(per_check_data)
|> Enum.reduce({:undecided, data}, fn
{rule, per_check_data}, {:undecided, data} ->
rule_with_per_check_data =
case per_check_data do
%{precheck: value} ->
%{rule | check: fn _, _, _ -> value end}
_ ->
rule
end
full_context = Map.merge(context, Map.get(per_check_data, :context) || %{})
checked_records = Rule.run_check(rule_with_per_check_data, user, data, full_context)
if Enum.any?(checked_records, &(&1.__authorization_decision__ == :unauthorized)) do
{:unauthorized, data}
else
remaining_records =
Enum.reject(checked_records, &(&1.__authorization_decision__ == :allow))
if Enum.empty?(remaining_records) do
{:allow, []}
else
{:undecided, remaining_records}
end
end
_, {decision, data} ->
{decision, data}
end)
if Enum.empty?(remaining_records) do
:allow
else
# Return some kind of information here?
# Maybe full auth breakdown in dev envs?
{:unauthorized, nil}
end
end
defp predict_result({instructions, per_check_data}, rules) do
prediction = get_prediction(Enum.zip(rules, per_check_data))
{Map.put(instructions, :prediction, prediction), per_check_data}
end
defp get_prediction([]), do: :unknown
defp get_prediction([{rule, %{precheck: value}} | rest]) do
case Rule.result_to_decision(rule.kind, value) do
:allow -> :allow
:undecided -> get_prediction(rest)
:unauthorized -> :unauthorized
end
end
defp get_prediction([{rule, _} | rest]) do
result_if_true = Rule.result_to_decision(rule.kind, true)
result_if_false = Rule.result_to_decision(rule.kind, false)
if result_if_true != :allow and result_if_false != :allow do
:unauthorized
else
get_prediction(rest)
end
end
defp handle_precheck_result(nil, instructions_and_data), do: instructions_and_data
defp handle_precheck_result(:ok, instructions_and_data), do: instructions_and_data
defp handle_precheck_result({:context, context}, {instructions, data}) do
{instructions, Map.update(data, :context, context, &Map.merge(&1, context))}
end
defp handle_precheck_result({:precheck, boolean}, {instructions, data})
when is_boolean(boolean) do
{instructions, Map.put(data, :precheck, boolean)}
end
defp handle_precheck_result({:side_load, relationship}, {instructions, data}) do
new_instructions =
instructions
|> Map.put_new(:side_load, [])
|> Map.update!(:side_load, &Keyword.put_new(&1, relationship, []))
{new_instructions, data}
end
defp precheck_result(%{precheck: nil}, _user, _context), do: nil
defp precheck_result(%{precheck: precheck}, user, context) do
case precheck do
{module, function, args} ->
if function_exported?(module, function, Enum.count(args) + 2) do
apply(module, function, [user, context] ++ args)
else
nil
end
function ->
function.(user, context)
end
end
end


@ -0,0 +1,28 @@
defmodule Ash.Authorization.Check do
@moduledoc """
A behaviour for declaring checks, which can be used to easily construct
authorization rules.
"""
alias Ash.Authorization.Rule
@type options :: Keyword.t()
@callback init(options()) :: {:ok, options()} | {:error, String.t()}
@callback check(Rule.user(), Rule.data(), Rule.context(), options()) :: Rule.resource_ids()
@callback describe(options()) :: String.t()
@callback precheck(Rule.user(), Rule.context(), options()) ::
Rule.precheck_result() | list(Rule.precheck_result())
@optional_callbacks precheck: 3
defmacro __using__(_) do
quote do
@behaviour Ash.Authorization.Check
def init(opts), do: opts
defoverridable init: 1
end
end
end
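For illustration, a hypothetical check implementing this behaviour; the `published` attribute is an assumption, not something defined in this diff:

```elixir
defmodule MyApp.Checks.Published do
  @moduledoc "Allows access to records whose `published` attribute is true (illustrative only)."
  use Ash.Authorization.Check

  # Batch check: return the ids of the records that pass.
  def check(_user, data, _context, _opts) do
    data
    |> Enum.filter(& &1.published)
    |> Enum.map(& &1.id)
  end

  def describe(_opts), do: "the record is published"
end
```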


@ -0,0 +1,110 @@
defmodule Ash.Authorization.Check.RelationshipAccess do
@moduledoc """
Allows the user to access the data if they are related to the resource via the provided relationship.
Use `enforce_access?: true` to have the precheck only allow access via the relationship
or that relationship's foreign keys.
#TODO: Document this better
"""
use Ash.Authorization.Check
def init(opts) do
with {:key, {:ok, relationship}} <- {:key, Keyword.fetch(opts, :relationship)},
{:is_nil, false} <- {:is_nil, is_nil(relationship)},
{:atom, true} <- {:atom, is_atom(relationship)} do
{:ok,
[relationship: relationship, enforce_access?: Keyword.get(opts, :enforce_access?, true)]}
else
{:key, :error} ->
{:error, "Must supply `:relationship` key"}
{:is_nil, true} ->
{:error, "`:relationship` must not be nil"}
{:atom, false} ->
{:error, "`:relationship` must be an atom"}
end
end
def check(nil, _, _, _), do: false
def check(user, data, %{resource: resource}, opts) do
relationship_name = opts[:relationship]
relationship = Ash.relationship(resource, relationship_name)
# The precheck sideloads the relationship
data
|> Stream.filter(fn item ->
item
|> Map.get(relationship_name)
|> Kernel.||([])
|> List.wrap()
|> Enum.find(fn related ->
Map.get(related, relationship.destination_field) == user.id
end)
end)
|> Enum.map(&Map.get(&1, :id))
end
def describe(opts) do
"the current user is the #{opts[:relationship]}"
end
def precheck(nil, _, _), do: {:precheck, false}
def precheck(
user,
%{resource: resource, changeset: changeset, relationships: relationships},
opts
) do
relationship_name = opts[:relationship]
relationship = Ash.relationship(resource, relationship_name)
source_field = relationship.source_field
cond do
Ecto.Changeset.get_field(changeset, source_field) == user.id ->
{:precheck, true}
match?(
%{^relationship_name => relationship_change_value}
when not is_nil(relationship_change_value),
relationships
) ->
related =
relationships
|> Map.get(relationship_name)
|> Enum.find(&(&1.id == user.id))
{:precheck, !!related}
opts[:enforce_access?] ->
{:precheck, false}
true ->
:ok
end
end
def precheck(user, %{resource: resource, params: params}, opts) do
relationship_name = opts[:relationship]
relationship = Ash.relationship(resource, relationship_name)
user_id = user.id
source_field = relationship.source_field
cond do
match?(%{filter: %{^relationship_name => ^user_id}}, params) ->
{:precheck, true}
relationship.type != :many_to_many &&
match?(%{filter: %{^source_field => ^user_id}}, params) ->
{:precheck, true}
opts[:enforce_access?] ->
{:precheck, false}
true ->
{:side_load, relationship_name}
end
end
end


@ -0,0 +1,28 @@
defmodule Ash.Authorization.Check.Static do
@moduledoc """
If this check is reached, it returns the static value provided.
Primarily useful for testing. There is no need to end a rule chain with this.
Instead, you can make the last rule a `deny_only`, or `allow_only` rule.
"""
use Ash.Authorization.Check
def init(opts) do
case opts[:result] do
value when is_boolean(value) ->
{:ok, [result: value]}
_ ->
{:error, "`result` must be a boolean"}
end
end
# in the current design this should technically not be reachable
def check(_, _, _, opts), do: opts[:result]
def describe(opts) do
"the result is always #{opts[:result]}"
end
def precheck(_, _, opts), do: {:precheck, opts[:result]}
end


@ -0,0 +1,51 @@
defmodule Ash.Authorization.Check.UserField do
@moduledoc """
This check allows access if a field on the resource directly matches a
field on the user.
"""
use Ash.Authorization.Check
def init(opts) do
with {:user, {:ok, user_field}} <- {:user, Keyword.fetch(opts, :user_field)},
{:record, {:ok, record_field}} <- {:record, Keyword.fetch(opts, :record_field)} do
{:ok, [record_field: record_field, user_field: user_field]}
else
{:user, :error} -> {:error, "Must supply `:user_field`"}
{:record, :error} -> {:error, "Must supply `:record_field`"}
end
end
def check(nil, _, _, _), do: false
def check(user, data, _, opts) do
user_value = Map.get(user, opts[:user_field])
data
|> Stream.filter(fn record ->
Map.get(record, opts[:record_field]) == user_value
end)
|> Enum.map(& &1.id)
end
def describe(opts) do
"the current user's #{opts[:user_field]} matches the record's #{opts[:record_field]}"
end
def precheck(nil, _, _), do: {:precheck, false}
def precheck(user, %{changeset: changeset}, opts) do
value_will_equal_field? =
changeset
|> Ecto.Changeset.get_field(opts[:record_field])
|> Kernel.==(Map.get(user, opts[:user_field]))
{:precheck, value_will_equal_field?}
end
def precheck(user, context, opts) do
user_value = Map.get(user, opts[:user_field])
record_field = opts[:record_field]
{:precheck, match?(%{params: %{filter: %{^record_field => ^user_value}}}, context)}
end
end


@ -0,0 +1,148 @@
defmodule Ash.Authorization.Rule do
defstruct [:kind, :check, :describe, :precheck]
@type kind :: :allow | :allow_unless | :allow_only | :deny | :deny_unless | :deny_only
@type user :: Ash.user() | nil
@type data :: list(Ash.resource())
@type context :: %{
required(:resource) => Ash.resource(),
required(:action) => Ash.action(),
required(:params) => Ash.params(),
optional(atom) => term
}
@type resource_ids :: list(term)
# Required sideloads before checks are run
@type side_load_instruction :: {:side_load, Ash.side_load()}
# The result for this check is predetermined for all records
# that could be passed in from this request.
@type precheck_instruction :: {:precheck, boolean}
@type precheck_context :: {:context, %{optional(atom) => term}}
@type precheck_result :: side_load_instruction() | precheck_instruction() | precheck_context()
@type check :: {module, atom, list(term)}
@type precheck :: {module, atom, list(term)}
@type describe :: String.t()
@type rule_options :: Keyword.t()
@type t() :: %__MODULE__{
kind: kind(),
check: check(),
describe: describe(),
precheck: precheck() | nil
}
@kinds [
:allow,
:allow_unless,
:allow_only,
:deny,
:deny_unless,
:deny_only
]
@builtin_checks %{
relationship_access: Ash.Authorization.Check.RelationshipAccess,
static: Ash.Authorization.Check.Static,
user_field: Ash.Authorization.Check.UserField
}
@builtin_check_names Map.keys(@builtin_checks)
@doc false
def kinds(), do: @kinds
for kind <- @kinds do
def unquote(kind)(opts) do
new(unquote(kind), opts)
end
def unquote(kind)(check, opts) do
new(unquote(kind), {check, opts})
end
end
def new({kind, opts}), do: new(kind, opts)
def new(kind, opts) when kind not in @kinds do
raise "Invalid rule declaration: #{kind}: #{inspect(opts)}"
end
def new(kind, module) when is_atom(module) do
new(kind, {module, []})
end
def new(kind, {name, opts}) when name in @builtin_check_names do
new(kind, {Map.get(@builtin_checks, name), opts})
end
def new(kind, {check_module, opts}) when is_list(opts) and is_atom(check_module) do
case check_module.init(opts) do
{:ok, opts} ->
new(kind,
check: {check_module, :check, [opts]},
describe: check_module.describe(opts),
precheck: {check_module, :precheck, [opts]}
)
{:error, error} ->
# TODO: nicer
raise error
end
end
def new(kind, opts) when is_list(opts) do
struct!(__MODULE__, Keyword.put(opts, :kind, kind))
end
def run_check(
%{check: check, kind: kind},
user,
data,
context
) do
check_function =
case check do
{module, function, args} ->
fn user, data, context ->
apply(module, function, [user, data, context] ++ args)
end
function ->
function
end
result = check_function.(user, data, context)
Enum.map(data, fn item ->
result =
case result do
true -> true
false -> false
ids -> item.id in ids
end
decision = result_to_decision(kind, result)
Map.put(item, :__authorization_decision__, decision)
end)
end
@spec result_to_decision(kind(), boolean()) :: Authorizer.result()
def result_to_decision(:allow, true), do: :allow
def result_to_decision(:allow, false), do: :undecided
def result_to_decision(:allow_only, true), do: :allow
def result_to_decision(:allow_only, false), do: :unauthorized
def result_to_decision(:allow_unless, true), do: :undecided
def result_to_decision(:allow_unless, false), do: :allow
def result_to_decision(:deny, true), do: :unauthorized
def result_to_decision(:deny, false), do: :undecided
def result_to_decision(:deny_only, true), do: :unauthorized
def result_to_decision(:deny_only, false), do: :allow
def result_to_decision(:deny_unless, true), do: :undecided
def result_to_decision(:deny_unless, false), do: :unauthorized
end
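A sketch of building rules with the generated kind helpers and the built-in checks; the field names are hypothetical:

```elixir
alias Ash.Authorization.Rule

# Built-in checks can be referenced by name with options...
rule = Rule.allow(:user_field, user_field: :id, record_field: :author_id)

# ...or a check module can be passed directly.
fallback = Rule.allow_only(Ash.Authorization.Check.Static, result: true)

# A check result maps to a decision according to the rule's kind:
Rule.result_to_decision(:allow, true)        # => :allow
Rule.result_to_decision(:allow, false)       # => :undecided
Rule.result_to_decision(:allow_only, false)  # => :unauthorized
```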

lib/ash/constraints.ex Normal file

@ -0,0 +1,4 @@
defmodule Ash.Constraints do
def positive?(integer), do: integer >= 0
def greater_than_zero?(integer), do: integer > 0
end


@ -0,0 +1,208 @@
defmodule Ash.DataLayer.Actions do
# def run_create_action(resource, action, attributes, relationships, params) do
# case Ash.Data.create(resource, action, attributes, relationships, params) do
# {:ok, record} ->
# Ash.Data.side_load(record, Map.get(params, :side_load, []), resource)
# {:error, error} ->
# {:error, error}
# end
# end
# def run_update_action(%resource{} = record, action, attributes, relationships, params) do
# with {:ok, record} <- Ash.Data.update(record, action, attributes, relationships, params),
# {:ok, [record]} <-
# Ash.Data.side_load([record], Map.get(params, :side_load, []), resource) do
# {:ok, record}
# else
# {:error, error} -> {:error, error}
# end
# end
# def run_destroy_action(record, action, params) do
# Ash.Data.delete(record, action, params)
# end
def run_read_action(resource, action, api, params) do
auth_context = %{
resource: resource,
action: action,
params: params
}
user = Map.get(params, :user)
auth? = Map.get(params, :authorize?, false)
with {%{prediction: prediction} = instructions, per_check_data}
when prediction != :unauthorized <-
maybe_authorize_precheck(auth?, user, action.rules, auth_context),
query <- Ash.DataLayer.resource_to_query(resource),
{:ok, filter} <- Ash.DataLayer.Filter.process(resource, Map.get(params, :filter, %{})),
{:ok, sort} <- Ash.DataLayer.Sort.process(resource, Map.get(params, :sort, [])),
{:ok, filtered_query} <- Ash.DataLayer.filter(query, filter, resource),
{:ok, sorted_query} <- Ash.DataLayer.sort(filtered_query, sort, resource),
{:ok, paginator} <-
Ash.DataLayer.Paginator.paginate(api, resource, action, sorted_query, params),
{:ok, found} <- Ash.DataLayer.run_query(paginator.query, resource),
{:ok, side_loaded_for_auth} <-
Ash.DataLayer.SideLoader.side_load(
resource,
found,
Map.get(instructions, :side_load, []),
api,
Map.take(params, [:authorize?, :user])
),
:allow <-
maybe_authorize(
auth?,
user,
side_loaded_for_auth,
action.rules,
auth_context,
per_check_data
),
{:ok, side_loaded} <-
Ash.DataLayer.SideLoader.side_load(
resource,
side_loaded_for_auth,
Map.get(params, :side_load, []),
api,
Map.take(params, [:authorize?, :user])
) do
{:ok, %{paginator | results: side_loaded}}
else
{:error, error} ->
{:error, error}
{%{prediction: :unauthorized}, _} ->
# TODO: Nice errors here!
{:error, :unauthorized}
{:unauthorized, _data} ->
# TODO: Nice errors here!
{:error, :unauthorized}
end
end
def run_create_action(resource, action, api, params) do
auth_context = %{
resource: resource,
action: action,
params: params
}
user = Map.get(params, :user)
auth? = Map.get(params, :authorize?, false)
# TODO: no instructions relevant to creates right now?
with {:ok, changeset, relationships} <- prepare_create_params(resource, params),
{%{prediction: prediction}, per_check_data}
when prediction != :unauthorized <-
maybe_authorize_precheck(
auth?,
user,
action.rules,
Map.merge(auth_context, %{changeset: changeset, relationships: relationships})
),
{:ok, created} <-
Ash.DataLayer.create(resource, changeset, relationships),
:allow <-
maybe_authorize(
auth?,
user,
[created],
action.rules,
auth_context,
per_check_data
),
{:ok, side_loaded} <-
Ash.DataLayer.SideLoader.side_load(
resource,
created,
Map.get(params, :side_load, []),
api,
Map.take(params, [:authorize?, :user])
) do
{:ok, side_loaded}
else
%Ecto.Changeset{valid?: false} ->
# TODO: Explain validation problems
{:error, "invalid changes"}
{:error, error} ->
{:error, error}
{%{prediction: :unauthorized}, _} ->
# TODO: Nice errors here!
{:error, :unauthorized}
{:unauthorized, _data} ->
# TODO: Nice errors here!
{:error, :unauthorized}
end
end
defp prepare_create_params(resource, params) do
attributes = Map.get(params, :attributes, %{})
relationships = Map.get(params, :relationships, %{})
with {:ok, changeset} <- prepare_create_attributes(resource, attributes),
{:ok, relationships} <- prepare_create_relationships(resource, relationships) do
{:ok, changeset, relationships}
else
{:error, error} ->
{:error, error}
end
end
defp prepare_create_attributes(resource, attributes) do
allowed_keys =
resource
|> Ash.attributes()
|> Enum.map(& &1.name)
resource
|> struct()
|> Ecto.Changeset.cast(Map.put_new(attributes, :id, Ecto.UUID.generate()), allowed_keys)
|> case do
%{valid?: true} = changeset ->
{:ok, changeset}
_error_changeset ->
# TODO: Print the errors here.
{:error, "invalid attributes"}
end
end
defp prepare_create_relationships(resource, relationships) do
relationships
# Eventually we'll have to just copy changeset's logic
# and/or use it directly (now that ecto is split up, maybe that's the way to do all of this?)
|> Enum.reduce({%{}, []}, fn {key, value}, {changes, errors} ->
case Ash.relationship(resource, key) do
nil ->
{changes, ["unknown attribute #{key}" | errors]}
_attribute ->
# TODO do actual value validation here
{Map.put(changes, key, value), errors}
end
end)
|> case do
{changes, []} -> {:ok, changes}
{_, errors} -> {:error, errors}
end
end
defp maybe_authorize(false, _, _, _, _, _), do: :allow
defp maybe_authorize(true, user, data, rules, auth_context, per_check_data) do
Ash.Authorization.Authorizer.authorize(user, data, rules, auth_context, per_check_data)
end
defp maybe_authorize_precheck(false, _, _, _), do: {%{prediction: :allow}, []}
defp maybe_authorize_precheck(true, user, rules, auth_context) do
Ash.Authorization.Authorizer.authorize_precheck(user, rules, auth_context)
end
end
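The params map is still informal (see the TODO list in the README). A sketch of the keys `run_read_action/4` appears to consume, reusing the hypothetical modules from earlier:

```elixir
read_action = Ash.primary_action(MyApp.Post, :read)

params = %{
  user: nil,                      # a user struct when authorize?: true
  authorize?: false,
  filter: %{title: "foo"},
  sort: [asc: :title],
  page: %{limit: 20, offset: 0},
  side_load: [:author]
}

Ash.DataLayer.Actions.run_read_action(MyApp.Post, read_action, MyApp.Api, params)
```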


@ -0,0 +1,149 @@
defmodule Ash.DataLayer do
@callback filter(Ash.query(), Ash.filter(), resource :: Ash.resource()) ::
{:ok, Ash.query()} | {:error, Ash.error()}
@callback sort(Ash.query(), Ash.sort(), resource :: Ash.resource()) ::
{:ok, Ash.query()} | {:error, Ash.error()}
@callback limit(Ash.query(), limit :: non_neg_integer(), resource :: Ash.resource()) ::
{:ok, Ash.query()} | {:error, Ash.error()}
@callback offset(Ash.query(), offset :: non_neg_integer(), resource :: Ash.resource()) ::
{:ok, Ash.query()} | {:error, Ash.error()}
@callback resource_to_query(Ash.resource()) :: Ash.query()
@callback can_query_async?(Ash.resource()) :: boolean
@callback run_query(Ash.query(), Ash.resource()) ::
{:ok, list(Ash.resource())} | {:error, Ash.error()}
@callback create(Ash.resource(), attributes :: map(), relationships :: map()) ::
{:ok, Ash.resource()} | {:error, Ash.error()}
# @callback create(
# Ash.resource(),
# Ash.action(),
# Ash.attributes(),
# Ash.relationships(),
# Ash.params()
# ) ::
# {:ok, Ash.record()} | {:error, Ash.error()}
# @callback update(
# Ash.record(),
# Ash.action(),
# Ash.attributes(),
# Ash.relationships(),
# Ash.params()
# ) ::
# {:ok, Ash.record()} | {:error, Ash.error()}
# @callback delete(Ash.record(), Ash.action(), Ash.params()) ::
# {:ok, Ash.record()} | {:error, Ash.error()}
# @callback append_related(Ash.record(), Ash.relationship(), Ash.resource_identifiers()) ::
# {:ok, Ash.record()} | {:error, Ash.error()}
# @callback delete_related(Ash.record(), Ash.relationship(), Ash.resource_identifiers()) ::
# {:ok, Ash.record()} | {:error, Ash.error()}
# @callback replace_related(Ash.record(), Ash.relationship(), Ash.resource_identifiers()) ::
# {:ok, Ash.record()} | {:error, Ash.error()}
# @spec create(Ash.resource(), Ash.action(), Ash.attributes(), Ash.relationships(), Ash.params()) ::
# {:ok, Ash.record()} | {:error, Ash.error()}
# def create(resource, action, attributes, relationships, params) do
# Ash.data_layer(resource).create(resource, action, attributes, relationships, params)
# end
# @spec update(Ash.record(), Ash.action(), Ash.attributes(), Ash.relationships(), Ash.params()) ::
# {:ok, Ash.record()} | {:error, Ash.error()}
# def update(%resource{} = record, action, attributes, relationships, params) do
# Ash.data_layer(resource).update(record, action, attributes, relationships, params)
# end
# @spec delete(Ash.record(), Ash.action(), Ash.params()) ::
# {:ok, Ash.record()} | {:error, Ash.error()}
# def delete(%resource{} = record, action, params) do
# Ash.data_layer(resource).delete(record, action, params)
# end
# @spec append_related(Ash.record(), Ash.relationship(), Ash.resource_identifiers()) ::
# {:ok, Ash.record()} | {:error, Ash.error()}
# def append_related(%resource{} = record, relationship, resource_identifiers) do
# Ash.data_layer(resource).append_related(record, relationship, resource_identifiers)
# end
# @spec delete_related(Ash.record(), Ash.relationship(), Ash.resource_identifiers()) ::
# {:ok, Ash.record()} | {:error, Ash.error()}
# def delete_related(%resource{} = record, relationship, resource_identifiers) do
# Ash.data_layer(resource).delete_related(record, relationship, resource_identifiers)
# end
# @spec replace_related(Ash.record(), Ash.relationship(), Ash.resource_identifiers()) ::
# {:ok, Ash.record()} | {:error, Ash.error()}
# def replace_related(%resource{} = record, relationship, resource_identifiers) do
# Ash.data_layer(resource).replace_related(record, relationship, resource_identifiers)
# end
@spec resource_to_query(Ash.resource()) :: Ash.query()
def resource_to_query(resource) do
Ash.data_layer(resource).resource_to_query(resource)
end
@spec create(Ash.resource(), Ecto.Changeset.t(), map) ::
{:ok, Ash.record()} | {:error, Ash.error()}
def create(resource, changeset, relationships) do
Ash.data_layer(resource).create(resource, changeset, relationships)
end
@spec filter(Ash.query(), Ash.filter(), Ash.resource()) ::
{:ok, Ash.query()} | {:error, Ash.error()}
def filter(query, filter, resource) do
data_layer = Ash.data_layer(resource)
data_layer.filter(query, filter, resource)
end
@spec sort(Ash.query(), Ash.sort(), Ash.resource()) ::
{:ok, Ash.query()} | {:error, Ash.error()}
def sort(query, sort, resource) do
data_layer = Ash.data_layer(resource)
data_layer.sort(query, sort, resource)
end
@spec limit(Ash.query(), limit :: non_neg_integer, Ash.resource()) ::
{:ok, Ash.query()} | {:error, Ash.error()}
def limit(query, limit, resource) do
data_layer = Ash.data_layer(resource)
data_layer.limit(query, limit, resource)
end
@spec offset(Ash.query(), offset :: non_neg_integer, Ash.resource()) ::
{:ok, Ash.query()} | {:error, Ash.error()}
def offset(query, offset, resource) do
data_layer = Ash.data_layer(resource)
data_layer.offset(query, offset, resource)
end
# @spec get_related(Ash.record(), Ash.relationship()) ::
# {:ok, list(Ash.record()) | Ash.record() | nil} | {:error, Ash.error()}
# def get_related(record, %{cardinality: :many} = relationship) do
# case relationship_query(record, relationship) do
# {:ok, query} ->
# get_many(query, Ash.to_resource(record))
# {:error, error} ->
# {:error, error}
# end
# end
# def get_related(record, %{cardinality: :one} = relationship) do
# case relationship_query(record, relationship) do
# {:ok, query} ->
# get_one(query, Ash.to_resource(record))
# {:error, error} ->
# {:error, error}
# end
# end
@spec run_query(Ash.query(), central_resource :: Ash.resource()) ::
{:ok, list(Ash.record())} | {:error, Ash.error()}
def run_query(query, central_resource) do
Ash.data_layer(central_resource).run_query(query, central_resource)
end
end

lib/ash/data_layer/ets.ex Normal file

@ -0,0 +1,227 @@
defmodule Ash.DataLayer.Ets do
@moduledoc """
An ETS-backed Ash data layer. Should only be used for testing, or for
unimportant/small datasets.
"""
@behaviour Ash.DataLayer
defmacro __using__(opts) do
quote bind_quoted: [opts: opts] do
@data_layer Ash.DataLayer.Ets
@ets_private? Keyword.get(opts, :private?, false)
def ets_private?() do
@ets_private?
end
end
end
def private?(resource) do
resource.ets_private?()
end
defmodule Query do
defstruct [:resource, :filter, :limit, :sort, offset: 0]
end
@impl true
def resource_to_query(resource) do
%Query{
resource: resource
}
end
@impl true
def limit(query, limit, _), do: {:ok, %Query{query | limit: limit}}
@impl true
def offset(query, offset, _), do: {:ok, %{query | offset: offset}}
@impl true
def can_query_async?(_), do: false
@impl true
def filter(query, filter, resource) do
# :ets.fun2ms(fn {_, })
Enum.reduce(filter, {:ok, query}, fn
_, {:error, error} ->
{:error, error}
{key, value}, {:ok, query} ->
do_filter(query, key, value, resource)
end)
end
@impl true
def sort(query, sort, _resource) do
{:ok, %{query | sort: sort}}
end
defp do_filter(query, field, id, _resource) do
{:ok, %{query | filter: Map.put(query.filter || %{}, field, id)}}
end
@impl true
def run_query(
%Query{resource: resource, filter: filter, offset: offset, limit: limit, sort: sort},
_
) do
with {:ok, match_spec} <- filter_to_matchspec(resource, filter),
{:ok, table} <- wrap_or_create_table(resource),
{:ok, results} <- match_limit(table, match_spec, limit, offset),
records <- Enum.map(results, &elem(&1, 1)),
sorted <- do_sort(records, sort),
without_offset <- Enum.drop(sorted, offset) do
{:ok, without_offset}
end
end
defp do_sort(results, empty) when empty in [nil, []], do: results
defp do_sort(results, [{:asc, field}]) do
Enum.sort_by(results, &Map.get(&1, field))
end
defp do_sort(results, [{:desc, field}]) do
results |> Enum.sort_by(&Map.get(&1, field)) |> Enum.reverse()
end
defp do_sort(results, [{:asc, field} | rest]) do
results
|> Enum.group_by(&Map.get(&1, field))
|> Enum.sort_by(fn {key, _value} -> key end)
|> Enum.flat_map(fn {_, records} ->
do_sort(records, rest)
end)
end
defp do_sort(results, [{:desc, field} | rest]) do
results
|> Enum.group_by(&Map.get(&1, field))
|> Enum.sort_by(fn {key, _value} -> key end)
|> Enum.reverse()
|> Enum.flat_map(fn {_, records} ->
do_sort(records, rest)
end)
end
defp filter_to_matchspec(resource, filter) do
starting_matchspec = {{:_, %{__struct__: resource}}, [], [:"$_"]}
filter
|> Kernel.||(%{})
|> Enum.reduce({:ok, {starting_matchspec, 1}}, fn
{key, value}, {:ok, {spec, binding}} ->
do_filter_to_matchspec(resource, key, value, spec, binding)
_, {:error, error} ->
{:error, error}
end)
|> case do
{:error, error} -> {:error, error}
{:ok, {spec, _}} -> {:ok, spec}
end
end
# TODO: Assuming id field, fix at some point
defp do_filter_to_matchspec(
_resource,
:id,
id,
{{_, struct_match}, conditions, matcher},
binding
) do
condition = {:==, :"$#{binding}", id}
{:ok, {{{:"$#{binding}", struct_match}, [condition | conditions], matcher}, binding + 1}}
end
defp do_filter_to_matchspec(resource, key, value, spec, binding) do
cond do
attr = Ash.attribute(resource, key) ->
do_filter_to_matchspec_attribute(resource, attr, value, spec, binding)
_rel = Ash.relationship(resource, key) ->
{:error, "relationship filtering not supported"}
true ->
{:error, "unsupported filter"}
end
end
defp do_filter_to_matchspec_attribute(
_resource,
%{name: name},
value,
{{id_match, struct_match}, conditions, matcher},
binding
) do
condition = {:==, :"$#{binding}", value}
new_spec =
{{id_match, Map.put(struct_match, name, :"$#{binding}")}, [condition | conditions], matcher}
{:ok, {new_spec, binding + 1}}
end
@impl true
def create(_resource, _attributes, relationships) when relationships != %{} do
{:error, "#{inspect(__MODULE__)} does not support creating with relationships"}
end
def create(resource, changeset, _relationships) do
with {:ok, table} <- wrap_or_create_table(resource),
record <- Ecto.Changeset.apply_changes(changeset),
{:ok, _} <- Ets.Set.put(table, {record.id, record}) do
{:ok, record}
else
{:error, error} -> {:error, error}
end
end
defp match_limit(table, match_spec, limit, offset) do
# TODO: Fix this
# This is a hack :(
# Either implement cursor based pagination
# or find a way to skip in ETS
result =
if limit do
Ets.Set.select(table, [match_spec], limit + offset)
else
Ets.Set.select(table, [match_spec])
end
case result do
{:ok, {matches, _}} -> {:ok, matches}
{:ok, :"$end_of_table"} -> {:ok, []}
{:error, error} -> {:error, error}
end
end
defp wrap_or_create_table(resource) do
case Ets.Set.wrap_existing(resource) do
{:error, :table_not_found} ->
protection =
if private?(resource) do
:private
else
:public
end
Ets.Set.new(
name: resource,
protection: protection,
ordered: true,
read_concurrency: true
)
{:ok, table} ->
{:ok, table}
{:error, other} ->
{:error, other}
end
end
end
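A sketch of driving the ETS data layer directly; `MyApp.TestResource` is a hypothetical resource module that pulled this data layer in with `use Ash.DataLayer.Ets`:

```elixir
# Assumes MyApp.TestResource declares a :title attribute and stores records with an :id.
query = Ash.DataLayer.Ets.resource_to_query(MyApp.TestResource)

{:ok, query} = Ash.DataLayer.Ets.filter(query, %{title: "foo"}, MyApp.TestResource)
{:ok, query} = Ash.DataLayer.Ets.sort(query, [asc: :title], MyApp.TestResource)
{:ok, records} = Ash.DataLayer.Ets.run_query(query, MyApp.TestResource)
```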


@ -0,0 +1,77 @@
defmodule Ash.DataLayer.Filter do
@filter_types [
:equal
]
@type filter_type :: :equal
@spec filter_types() :: list(filter_type())
def filter_types() do
@filter_types
end
# This logic will need to get more complex as the ability to customize filter handling arises
# as well as when complex filter types are added
def process(resource, filter) do
filter
|> Enum.reduce({%{}, []}, fn {name, value}, {acc, errors} ->
process_filter(resource, name, value, {acc, errors})
end)
|> case do
{filter, []} -> {:ok, filter}
{_, errors} -> {:error, errors}
end
end
# TODO: Look into making `from_related` accept a full filter statement for the source entity,
# so you can say `%{filter: %{from_related: %{owner: %{name: "zach"}}}}`. This would let us optimize
# and predict query results better, as well as represent the request to "get" those entities we
# are filtering against as an ash request, so that authorization happens for free :D
defp process_filter(_resource, :from_related, {[], relationship}, {filter, errors})
when is_list(relationship) do
{Map.put(filter, :__impossible__, true), errors}
end
defp process_filter(resource, :from_related, {related, relationship_name}, {filter, errors})
when is_atom(relationship_name) do
case Ash.relationship(resource, relationship_name) do
nil ->
{filter, ["no such relationship: #{relationship_name}" | errors]}
relationship ->
{Map.put(filter, :from_related, {related, relationship}), errors}
end
end
defp process_filter(resource, field, value, {filter, errors}) do
cond do
attr = Ash.attribute(resource, field) ->
process_attribute_filter(resource, attr, value, {filter, errors})
rel = Ash.relationship(resource, field) ->
process_relationship_filter(resource, rel, value, {filter, errors})
true ->
{filter, ["Unsupported filter: #{inspect(field)}" | errors]}
end
end
defp process_attribute_filter(resource, %{name: name, type: type}, value, {filter, errors}) do
with {:ok, casted} <- Ash.Type.cast_input(type, value),
filters <- Ash.Type.supported_filter_types(type, Ash.data_layer(resource)),
{:supported, true} <- {:supported, :equal in filters} do
{Map.put(filter, name, casted), errors}
else
:error ->
{filter, ["Invalid value: #{inspect(value)} for #{inspect(name)}" | errors]}
{:supported, false} ->
{filter, ["Cannot filter #{inspect(name)} for equality." | errors]}
end
end
defp process_relationship_filter(_resource, %{name: name}, value, {filter, errors}) do
# TODO: type validate, potentially expand list of ids into a boolean filter statement
{Map.put(filter, name, value), errors}
end
end
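A sketch of what `process/2` returns for a simple equality filter, assuming a hypothetical resource with a `:title` attribute whose type supports equality:

```elixir
{:ok, %{title: "foo"}} = Ash.DataLayer.Filter.process(MyApp.Post, %{title: "foo"})

# Unknown fields are accumulated as errors rather than silently dropped.
{:error, ["Unsupported filter: :nope"]} = Ash.DataLayer.Filter.process(MyApp.Post, %{nope: 1})
```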


@ -0,0 +1,73 @@
defmodule Ash.DataLayer.Paginator do
defstruct [:limit, :offset, :total, :query, :results]
# TODO: Support more pagination strategies
@type t :: %__MODULE__{
limit: nil | non_neg_integer(),
offset: nil | non_neg_integer(),
total: nil | non_neg_integer(),
query: Ash.query(),
results: nil | list(Ash.resource())
}
@spec paginate(
Ash.api(),
Ash.resource(),
Ash.action(),
Ash.query(),
params :: %{optional(String.t()) => term}
) ::
{:ok, t()} | {:error, Ash.error()}
def paginate(_api, _resource, %{paginate?: false}, query, _params) do
{:ok,
%__MODULE__{
query: query
}}
end
def paginate(api, resource, _action, query, params) do
with {:ok, %__MODULE__{limit: limit, offset: offset} = paginator} <-
paginator(api, resource, params),
{:ok, query} <- Ash.DataLayer.offset(query, offset, resource),
{:ok, query} <- Ash.DataLayer.limit(query, limit, resource) do
{:ok, %{paginator | query: query}}
else
{:error, error} -> {:error, error}
end
end
defp paginator(api, resource, %{page: page}) do
# TODO: Make limit configurable
page_size =
page
|> Map.get(:limit)
|> Kernel.||(Ash.default_page_size(api, resource))
|> Kernel.||(20)
|> Kernel.min(Ash.max_page_size(api, resource))
offset = Map.get(page, :offset, 0)
with {:offset, true} <- {:offset, is_integer(offset) and offset >= 0},
{:limit, true} <- {:limit, is_integer(page_size) and page_size >= 0} do
{:ok,
%__MODULE__{
offset: Map.get(page, :offset, 0),
limit: page_size,
total: nil
}}
else
{:offset, false} -> {:error, "invalid offset"}
{:limit, false} -> {:error, "invalid limit"}
end
end
defp paginator(api, resource, _) do
# TODO: Make limit configurable
{:ok,
%__MODULE__{
offset: 0,
limit: Ash.default_page_size(api, resource) || 20,
total: nil
}}
end
end
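A sketch of how the paginator resolves limits, assuming the hypothetical api/resource pair from earlier (default page size 25, max 100) and a `query`/`read_action` produced by the data layer and resource:

```elixir
{:ok, paginator} =
  Ash.DataLayer.Paginator.paginate(MyApp.Api, MyApp.Post, read_action, query, %{
    page: %{limit: 10, offset: 20}
  })

paginator.limit   # => 10, capped at Ash.max_page_size(api, resource)
paginator.offset  # => 20

# With no :page key, the api/resource default page size (or 20) is used.
{:ok, _paginator} =
  Ash.DataLayer.Paginator.paginate(MyApp.Api, MyApp.Post, read_action, query, %{})
```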


@ -0,0 +1,120 @@
defmodule Ash.DataLayer.SideLoader do
def side_load(resource, record, keyword, api, global_params \\ %{})
def side_load(_resource, record_or_records, [], _api, _global_params),
do: {:ok, record_or_records}
def side_load(resource, record, side_loads, api, global_params) when not is_list(record) do
case side_load(resource, [record], side_loads, api, global_params) do
{:ok, [side_loaded]} -> side_loaded
{:error, error} -> {:error, error}
end
end
def side_load(resource, records, side_loads, api, global_params) do
# TODO: No global config!
{side_load_type, config} = Ash.side_load_config(api)
async? = side_load_type == :parallel
side_loads =
Enum.map(side_loads, fn side_load_part ->
if is_atom(side_load_part) do
{side_load_part, []}
else
side_load_part
end
end)
side_loaded =
side_loads
|> maybe_async_stream(config, async?, fn relationship_name, further ->
relationship = Ash.relationship(resource, relationship_name)
# Combining filters, and handling boolean filters is
# going to come into play here. #TODO
# need to be able to configure options specific to the path of the preload!
action_params =
global_params
|> Map.put(:filter, %{
# TODO: This filter needs to be supported and documented, e.g for authorization
from_related: {records, relationship}
})
|> Map.put_new(:paginate?, false)
with {:ok, %{results: related_records}} <- api.read(relationship.destination, action_params),
{:ok, side_loaded_related} <-
side_load(relationship.destination, related_records, further, api, global_params) do
keyed_by_id =
Enum.group_by(side_loaded_related, fn record ->
# This is required for many to many relationships
Map.get(record, :__related_id__) ||
Map.get(record, relationship.destination_field)
end)
Enum.map(records, fn record ->
related_to_this_record =
Map.get(keyed_by_id, Map.get(record, relationship.source_field)) || []
unwrapped =
if relationship.cardinality == :many do
related_to_this_record
else
List.first(related_to_this_record)
end
related_ids = Enum.map(related_to_this_record, fn record -> record.id end)
linked_record =
record
|> Map.put(relationship_name, unwrapped)
|> Map.put_new(:__linkage__, %{})
|> Map.update!(:__linkage__, &Map.put(&1, relationship_name, related_ids))
{:ok, linked_record}
end)
else
{:error, error} -> {:error, error}
end
end)
|> List.flatten()
# This is dumb, should handle these errors better
first_error =
Enum.find(side_loaded, fn side_loaded ->
match?({:error, _error}, side_loaded)
end)
first_error || {:ok, Enum.map(side_loaded, &elem(&1, 1))}
end
defp maybe_async_stream(preloads, _opts, false, function) do
Enum.map(preloads, fn {association, further} ->
function.(association, further)
end)
end
defp maybe_async_stream(preloads, opts, true, function) do
# We could theoretically do one of them outside of a task while we wait for the rest
# Not worth implementing to start, IMO.
opts = [
max_concurrency: opts[:max_concurrency] || System.schedulers_online(),
ordered: false,
timeout: opts[:timeout] || :timer.seconds(5),
on_timeout: :kill_task,
shutdown: opts[:shutdown] || :timer.seconds(5)
]
opts[:supervisor]
|> Task.Supervisor.async_stream_nolink(
preloads,
fn {key, further} -> function.(key, further) end,
opts
)
|> Stream.map(&to_result/1)
end
defp to_result({:exit, reason}), do: {:error, {:exit, reason}}
defp to_result({:ok, {:ok, value}}), do: {:ok, value}
defp to_result({:ok, {:error, error}}), do: {:error, error}
end


@ -0,0 +1,31 @@
defmodule Ash.DataLayer.Sort do
def process(_resource, empty) when empty in [nil, []], do: {:ok, []}
def process(resource, sort) when is_list(sort) do
sort
|> Enum.reduce({[], []}, fn
{order, field}, {sorts, errors} when order in [:asc, :desc] ->
attribute = Ash.attribute(resource, field)
cond do
!attribute ->
{sorts, ["no such attribute: #{field}" | errors]}
!Ash.Type.sortable?(attribute.type, Ash.data_layer(resource)) ->
{sorts, ["Cannot sort on #{inspect(field)}"]}
true ->
{sorts ++ [{order, field}], errors}
end
sort, {sorts, errors} ->
{sorts, ["invalid sort: #{inspect(sort)}" | errors]}
end)
|> case do
{sorts, []} -> {:ok, sorts}
{_, errors} -> {:error, errors}
end
end
def process(_resource, _), do: {:error, "invalid sort"}
end
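A sketch of the sort format `process/2` validates, assuming `:title` and `:inserted_at` are sortable attributes of a hypothetical resource:

```elixir
{:ok, [asc: :title, desc: :inserted_at]} =
  Ash.DataLayer.Sort.process(MyApp.Post, asc: :title, desc: :inserted_at)

{:error, ["no such attribute: nope"]} =
  Ash.DataLayer.Sort.process(MyApp.Post, asc: :nope)
```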


@ -0,0 +1,3 @@
defmodule Ash.Error.FrameworkError do
defexception [:message]
end


@ -0,0 +1,22 @@
defmodule Ash.Error.ResourceDslError do
defexception [:message, :path, :option, :resource, :using]
def message(%{message: message, path: nil, option: option, resource: resource, using: using})
when not is_nil(using) do
"#{inspect(resource)}: `use #{inspect(using)}, ...` #{option} #{message} "
end
def message(%{message: message, path: nil, option: option, resource: resource}) do
"#{inspect(resource)}: #{option} #{message}"
end
def message(%{message: message, path: dsl_path, option: nil, resource: resource}) do
dsl_path = Enum.join(dsl_path, "->")
"#{inspect(resource)}: #{message} at #{dsl_path}"
end
def message(%{message: message, path: dsl_path, option: option, resource: resource}) do
dsl_path = Enum.join(dsl_path, "->")
"#{inspect(resource)}: option #{option} at #{dsl_path} #{message}"
end
end
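A sketch of how the message clauses compose; the resource, path, and option are hypothetical:

```elixir
raise Ash.Error.ResourceDslError,
  resource: MyApp.Post,
  path: [:attributes, :attribute],
  option: :type,
  message: "is required"

# => ** (Ash.Error.ResourceDslError) MyApp.Post: option type at attributes->attribute is required
```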


@ -1,27 +0,0 @@
defmodule Ash.JsonApi.Controllers.Get do
def init(options) do
# initialize options
options
end
def call(%{path_params: %{"id" => id}} = conn, options) do
resource = options[:resource]
request = Ash.Request.from(conn, resource, :get)
case Ash.Repo.get(resource, id) do
nil ->
conn
# |> put_resp_content_type("text/plain")
|> Plug.Conn.send_resp(404, "uh oh")
found ->
serialized = Ash.JsonApi.Serializer.serialize_one(request, found)
conn
|> Plug.Conn.put_resp_content_type("application/vnd.api+json")
|> Plug.Conn.send_resp(200, serialized)
end
|> Plug.Conn.halt()
end
end


@ -1,23 +0,0 @@
defmodule Ash.JsonApi.Controllers.Index do
def init(options) do
# initialize options
options
end
def call(conn, options) do
resource = options[:resource]
request = Ash.Request.from(conn, resource, :index)
paginator = Ash.JsonApi.Paginator.paginate(request, resource)
found = Ash.Repo.all(paginator.query)
serialized = Ash.JsonApi.Serializer.serialize_many(request, paginator, found)
conn
|> Plug.Conn.put_resp_content_type("application/vnd.api+json")
|> Plug.Conn.send_resp(200, serialized)
|> Plug.Conn.halt()
end
end


@ -1,12 +0,0 @@
defmodule Ash.JsonApi.Controllers.NoRouteFound do
def init(options) do
# initialize options
options
end
def call(conn, _options) do
conn
|> Plug.Conn.send_resp(404, "no route found")
|> Plug.Conn.halt()
end
end


@ -1,15 +0,0 @@
defmodule Ash.JsonApi do
# Honestly, at some point json api should probably be its own thing
defmacro build_routes(scope) do
quote do
require Ash.JsonApi.RouteBuilder
scope unquote(scope) do
for resource <- Ash.resources() do
Ash.JsonApi.RouteBuilder.build_resource_routes(resource)
end
end
end
end
end


@ -1,41 +0,0 @@
defmodule Ash.JsonApi.Paginator do
defstruct [:limit, :offset, :total, :query]
require Ecto.Query
def paginate(request, query) do
paginator = paginator(request)
limit = paginator.limit
offset = paginator.offset
new_query =
query
|> Ecto.Query.offset(^offset)
|> Ecto.Query.limit(^limit)
%{paginator | query: new_query}
end
defp paginator(%{query_params: %{"page" => page}}) do
# TODO: Make limit configurable
%__MODULE__{
offset: Map.get(page, "offset", 0) |> to_integer(),
limit: Map.get(page, "limit", 20) |> to_integer(),
total: nil
}
end
defp paginator(_) do
# TODO: Make limit configurable
%__MODULE__{
offset: 0,
limit: 20,
total: nil
}
end
defp to_integer(value) when is_bitstring(value) do
String.to_integer(value)
end
defp to_integer(value) when is_integer(value), do: value
end


@ -1,24 +0,0 @@
defmodule Ash.JsonApi.RouteBuilder do
defmacro build_resource_routes(resource) do
quote bind_quoted: [resource: resource] do
Ash.JsonApi.RouteBuilder.build_get_route(resource)
Ash.JsonApi.RouteBuilder.build_index_route(resource)
end
end
defmacro build_get_route(resource) do
quote bind_quoted: [resource: resource] do
for %{expose?: true, type: :get, path: path} = action <- Ash.actions(resource) do
get(path, to: Ash.JsonApi.Controllers.Get, init_opts: [resource: resource])
end
end
end
defmacro build_index_route(resource) do
quote bind_quoted: [resource: resource] do
for %{expose?: true, type: :index, path: path} = action <- Ash.actions(resource) do
get(path, to: Ash.JsonApi.Controllers.Index, init_opts: [resource: resource])
end
end
end
end


@ -1,28 +0,0 @@
defmodule Ash.JsonApi.Router do
defmacro __using__(_) do
quote do
# TODO: Make it so that these can have their routes printed
# And get that into phoenix
use Plug.Router
require Ash.JsonApi.RouteBuilder
plug(:match)
plug(Plug.Parsers,
parsers: [:json],
pass: ["application/json"],
json_decoder: Jason
)
plug(:dispatch)
for resource <- Ash.resources() do
Code.ensure_compiled(resource)
Ash.JsonApi.RouteBuilder.build_resource_routes(resource)
end
match(_, to: Ash.JsonApi.Controllers.NoRouteFound)
end
end
end

View file

@@ -1,176 +0,0 @@
defmodule Ash.JsonApi.Serializer do
alias Ash.Request
def serialize_many(request, paginator, records) do
data = Enum.map(records, &serialize_one_record(request, &1))
json_api = %{version: "1.0"}
links = many_links(request, paginator)
Jason.encode!(%{data: data, json_api: json_api, links: links})
end
def serialize_one(request, record) do
# TODO `links` and `included`
data = serialize_one_record(request, record)
json_api = %{version: "1.0"}
links = one_links(request)
Jason.encode!(%{data: data, json_api: json_api, links: links})
end
defp many_links(%{url: url} = request, paginator) do
uri = URI.parse(request.url)
query = Plug.Conn.Query.decode(uri.query || "")
%{
first: first_link(uri, query, paginator),
self: url
}
|> add_last_link(uri, query, paginator)
|> add_prev_link(uri, query, paginator)
|> add_next_link(uri, query, paginator)
end
defp first_link(uri, query, paginator) do
new_query =
query
|> Map.put("page", %{
limit: paginator.limit,
offset: 0
})
|> Plug.Conn.Query.encode()
uri
|> Map.put(:query, new_query)
|> URI.to_string()
end
defp add_next_link(links, _uri, _query, %{offset: offset, limit: limit, total: total})
when not is_nil(total) and offset + limit >= total,
do: links
defp add_next_link(links, uri, query, %{offset: offset, limit: limit}) do
new_query =
query
|> Map.put("page", %{
limit: limit + offset,
offset: offset
})
|> Plug.Conn.Query.encode()
link =
uri
|> Map.put(:query, new_query)
|> URI.to_string()
Map.put(links, :next, link)
end
defp add_next_link(links, uri, query, paginator) do
new_query =
query
|> Map.put("page", %{
limit: paginator.limit,
offset: 0
})
|> Plug.Conn.Query.encode()
link =
uri
|> Map.put(:query, new_query)
|> URI.to_string()
Map.put(links, :prev, link)
end
defp add_prev_link(links, _uri, _query, %{offset: 0}), do: links
defp add_prev_link(links, uri, query, paginator) do
new_query =
query
|> Map.put("page", %{
limit: paginator.limit,
offset: 0
})
|> Plug.Conn.Query.encode()
link =
uri
|> Map.put(:query, new_query)
|> URI.to_string()
Map.put(links, :prev, link)
end
defp add_last_link(links, _uri, _query, %{total: nil}) do
links
end
defp add_last_link(links, uri, query, %{total: total, limit: limit}) do
new_query =
query
|> Map.put("page", %{
limit: limit,
offset: total - limit
})
|> Plug.Conn.Query.encode()
link =
uri
|> Map.put(:query, new_query)
|> URI.to_string()
Map.put(links, "last", link)
end
defp one_links(request) do
%{
self: request.url
}
end
defp serialize_one_record(%Request{resource: resource} = request, record) do
# TODO: `relationships` `meta`
%{
id: record.id,
type: Ash.type(resource),
attributes: serialize_attributes(resource, record),
relationships: serialize_relationships(resource, record),
links: %{
self: at_host(request, Ash.Routes.get(resource, record.id))
}
}
end
defp serialize_relationships(resource, _record) do
# TODO: links.self, links.related
resource
|> Ash.relationships()
|> Enum.into(%{}, fn relationship ->
value = %{
links: %{},
data: %{},
meta: %{}
}
{relationship.name, value}
end)
end
defp at_host(request, route) do
request.url
|> URI.parse()
|> Map.put(:query, nil)
|> Map.put(:path, "/" <> Path.join(request.json_api_prefix, route))
|> URI.to_string()
end
defp serialize_attributes(resource, record) do
resource
|> Ash.attributes()
|> Keyword.delete(:id)
|> Enum.reduce(%{}, fn attribute, acc ->
Map.put(acc, attribute.name, Map.get(record, attribute.name))
end)
end
end

View file

@@ -1,25 +0,0 @@
defmodule Ash.Repo do
use Ecto.Repo,
# TODO: Is this wrong? Maybe not? Maybe better to only configure priv/other things
otp_app: Application.get_env(:ash, :otp_app),
adapter: Ecto.Adapters.Postgres
def init(_type, config) do
database_name = Application.fetch_env!(:ash, :database_name)
database_username = Application.fetch_env!(:ash, :database_username)
database_password = Application.fetch_env!(:ash, :database_password)
database_hostname = Application.fetch_env!(:ash, :database_hostname)
# TODO configurable
migration_primary_key = [name: :id, type: :binary_id]
new_config =
config
|> Keyword.put(:database, database_name)
|> Keyword.put(:username, database_username)
|> Keyword.put(:password, database_password)
|> Keyword.put(:hostname, database_hostname)
|> Keyword.put(:migration_primary_key, migration_primary_key)
{:ok, new_config}
end
end

View file

@@ -1,28 +0,0 @@
defmodule Ash.Request do
require Logger
defstruct [
:action,
:resource,
:route,
:path_params,
:query_params,
:url,
:json_api_prefix
]
def from(conn, resource, action) do
request = %__MODULE__{
resource: resource,
action: action,
url: Plug.Conn.request_url(conn),
path_params: conn.path_params,
query_params: conn.query_params,
json_api_prefix: Application.get_env(:ash, :json_api_prefix) || ""
}
Logger.info("Got request: #{inspect(request)}")
request
end
end

View file

@@ -1,35 +1,142 @@
defmodule Ash.Resource do
@primary_key_schema Ashton.schema(
opts: [field: :atom, type: :atom],
defaults: [field: :id, type: :uuid],
describe: [
field: "The field name of the primary key of the resource.",
type: "The data type of the primary key of the resource."
]
)
@resource_opts_schema Ashton.schema(
opts: [
name: :string,
type: :string,
max_page_size: :integer,
default_page_size: :integer,
primary_key: [
:boolean,
@primary_key_schema
]
],
describe: [
name:
"The name of the resource. This will typically be the pluralized form of the type",
type:
"The type of the resource, e.g `post` or `author`. This is used throughout the system.",
max_page_size:
"The maximum page size for any read action. Any request for a higher page size will simply use this number.",
default_page_size:
"The default page size for any read action. If no page size is specified, this value is used.",
primary_key:
"If true, a default `id` uuid primary key is used. If false, none is created. See the primary_key opts for info on specifying primary key options."
],
required: [:name, :type],
defaults: [
primary_key: true
],
constraints: [
max_page_size:
{&Ash.Constraints.greater_than_zero?/1, "must be greater than zero"},
default_page_size:
{&Ash.Constraints.greater_than_zero?/1, "must be greater than zero"}
]
)
@moduledoc """
The entry point for creating an `Ash.Resource`.
This brings in the top level DSL macros, defines module attributes for aggregating state as
DSL functions are called, and defines a set of functions internal to the resource that can be
used to inspect them.
Simply add `use Ash.Resource, ...` at the top of your resource module, and refer to the DSL
documentation for the rest. The options for `use Ash.Resource` are described below.
Resource DSL documentation: `Ash.Resource.DSL`
#{Ashton.document(@resource_opts_schema)}
Note:
*Do not* call the functions on a resource directly, as in `MyResource.type()`; this is a *private*
API and can change at any time. Instead, use the `Ash` module, for example: `Ash.type(MyResource)`.
"""
defmacro __using__(opts) do
quote do
@before_compile Ash.Resource
Module.register_attribute(__MODULE__, :actions, accumulate: true)
Module.register_attribute(__MODULE__, :attributes, accumulate: true)
Module.register_attribute(__MODULE__, :relationships, accumulate: true)
opts = Ash.Resource.validate_use_opts(__MODULE__, unquote(opts))
Ash.Resource.define_resource_module_attributes(__MODULE__, opts)
Ash.Resource.define_primary_key(__MODULE__, opts)
@attributes Ash.Resource.Attributes.Attribute.new(:id, :uuid)
# Module.put_attribute(__MODULE__, :custom_threshold_for_lib, 10)
import Ash.Resource
import Ash.Resource.Actions, only: [actions: 1]
import Ash.Resource.Attributes, only: [attributes: 1]
import Ash.Resource.Relationships, only: [relationships: 1]
name = unquote(opts[:name])
resource_type = unquote(opts[:type])
@name name
@resource_type resource_type
use Ash.Resource.DSL
end
end
defmacro __before_compile__(_env) do
@doc false
def define_resource_module_attributes(mod, opts) do
Module.register_attribute(mod, :before_compile_hooks, accumulate: true)
Module.register_attribute(mod, :actions, accumulate: true)
Module.register_attribute(mod, :attributes, accumulate: true)
Module.register_attribute(mod, :relationships, accumulate: true)
Module.register_attribute(mod, :mix_ins, accumulate: true)
Module.put_attribute(mod, :name, opts[:name])
Module.put_attribute(mod, :resource_type, opts[:type])
Module.put_attribute(mod, :max_page_size, opts[:max_page_size])
Module.put_attribute(mod, :default_page_size, opts[:default_page_size])
Module.put_attribute(mod, :data_layer, nil)
end
@doc false
def define_primary_key(mod, opts) do
case opts[:primary_key] do
true ->
attribute = Ash.Resource.Attributes.Attribute.new(mod, :id, :uuid, primary_key?: true)
Module.put_attribute(mod, :attributes, attribute)
false ->
:ok
opts ->
attribute =
Ash.Resource.Attributes.Attribute.new(mod, opts[:field], opts[:type], primary_key?: true)
Module.put_attribute(mod, :attributes, attribute)
end
end
@doc false
def validate_use_opts(mod, opts) do
case Ashton.validate(opts, @resource_opts_schema) do
{:error, [{key, message} | _]} ->
raise Ash.Error.ResourceDslError,
resource: mod,
using: __MODULE__,
option: key,
message: message
{:ok, opts} ->
opts
end
end
defmacro __before_compile__(env) do
quote do
if __MODULE__ not in Ash.resources() do
raise "Your module (#{inspect(__MODULE__)}) must be in config, :ash, resources: [...]"
@sanitized_actions Ash.Resource.mark_primaries(@actions)
@ash_primary_key Ash.Resource.primary_key(@attributes)
unless @ash_primary_key do
raise "Must have a primary key for a resource: #{__MODULE__}"
end
require Ash.Resource.Schema
require Ash.Schema
Ash.Schema.define_schema(@name)
def type() do
@resource_type
@@ -40,18 +147,83 @@
end
def actions() do
@actions
@sanitized_actions
end
def attributes() do
@attributes
end
def primary_key() do
@ash_primary_key
end
def name() do
@name
end
Ash.Resource.Schema.define_schema(@name)
def mix_ins() do
@mix_ins
end
def max_page_size() do
@max_page_size
end
def default_page_size() do
@default_page_size
end
def data_layer() do
@data_layer
end
Enum.map(@mix_ins || [], fn hook_module ->
code = hook_module.before_compile_hook(unquote(Macro.escape(env)))
Module.eval_quoted(__MODULE__, code)
end)
end
end
@doc false
def primary_key(attributes) do
attributes
|> Enum.filter(& &1.primary_key?)
|> Enum.map(& &1.name)
|> case do
[] ->
nil
[single] ->
single
other ->
other
end
end
@doc false
def mark_primaries(all_actions) do
all_actions
|> Enum.group_by(& &1.type)
|> Enum.flat_map(fn {type, actions} ->
case actions do
[action] ->
[%{action | primary?: true}]
actions ->
case Enum.count(actions, & &1.primary?) do
0 ->
# TODO: Format these prettier
raise "Must declare a primary action for #{type}, as there are more than one."
1 ->
actions
_ ->
raise "Duplicate primary actions declared for #{type}, but there can only be one primary action."
end
end
end)
end
end

View file

@@ -1,12 +0,0 @@
defmodule Ash.Resource.Actions.Action do
defstruct [:expose?, :type, :name, :path]
def new(name, type, opts \\ []) do
%__MODULE__{
name: name,
expose?: opts[:expose?] || false,
type: type,
path: opts[:path] || to_string(name)
}
end
end

View file

@@ -2,27 +2,108 @@ defmodule Ash.Resource.Actions do
defmacro actions(do: block) do
quote do
import Ash.Resource.Actions
import Ash.Authorization.Rule,
only: [
allow: 1,
allow: 2,
allow_unless: 1,
allow_unless: 2,
allow_only: 1,
allow_only: 2,
deny: 1,
deny: 2,
deny_unless: 1,
deny_unless: 2,
deny_only: 1,
deny_only: 2
]
unquote(block)
import Ash.Resource.Actions, only: [actions: 1]
import Ash.Authorization.Rule, only: []
end
end
defmacro get(opts) do
defmacro defaults(:all) do
quote do
name = unquote(opts[:name]) || :get
# TODO: do this somewhere centrally somewhere else
path = Path.join("#{@name}/", unquote(opts[:path]) || "/:id")
expose? = unquote(opts[:expose?]) || false
@actions Ash.Resource.Actions.Action.new(name, :get, expose?: expose?, path: path)
defaults([:create, :update, :destroy, :read])
end
end
defmacro index(opts) do
defmacro defaults(defaults, opts \\ []) do
quote do
name = unquote(opts[:name]) || :index
path = "#{@name}/"
expose? = unquote(opts[:expose?]) || false
@actions Ash.Resource.Actions.Action.new(name, :index, expose?: expose?, path: path)
opts = unquote(opts)
for default <- unquote(defaults) do
case default do
:create ->
create(:default, opts)
:update ->
update(:default, opts)
:destroy ->
destroy(:default, opts)
:read ->
read(:default, opts)
action ->
raise "Invalid action type #{action} listed in defaults list for resource: #{
__MODULE__
}"
end
end
end
end
defmacro create(name, opts \\ []) do
quote bind_quoted: [name: name, opts: opts] do
action =
Ash.Resource.Actions.Create.new(name,
primary?: opts[:primary?] || false,
rules: opts[:rules] || []
)
@actions action
end
end
defmacro update(name, opts \\ []) do
quote bind_quoted: [name: name, opts: opts] do
action =
Ash.Resource.Actions.Update.new(name,
primary?: opts[:primary?] || false,
rules: opts[:rules] || []
)
@actions action
end
end
defmacro destroy(name, opts \\ []) do
quote bind_quoted: [name: name, opts: opts] do
action =
Ash.Resource.Actions.Destroy.new(name,
primary?: opts[:primary?] || false,
rules: opts[:rules] || []
)
@actions action
end
end
defmacro read(name, opts \\ []) do
quote bind_quoted: [name: name, opts: opts] do
action =
Ash.Resource.Actions.Read.new(name,
primary?: opts[:primary?] || false,
rules: opts[:rules] || [],
paginate?: Keyword.get(opts, :paginate?, true)
)
@actions action
end
end
end
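A sketch of how the macros above compose in a resource's `actions` block (every name except `:default` is illustrative). `defaults/1` expands to the listed action macros with the name `:default`, and when more than one action of a type is declared, exactly one must be marked `primary?: true` (see `mark_primaries/1` in `Ash.Resource` above):

```elixir
actions do
  defaults [:create, :update, :destroy]

  read :default, primary?: true
  read :unpaginated, paginate?: false
end
```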

View file

@@ -0,0 +1,12 @@
defmodule Ash.Resource.Actions.Create do
defstruct [:type, :name, :primary?, :rules]
def new(name, opts \\ []) do
%__MODULE__{
name: name,
type: :create,
primary?: opts[:primary?],
rules: opts[:rules]
}
end
end

View file

@@ -0,0 +1,12 @@
defmodule Ash.Resource.Actions.Destroy do
defstruct [:type, :name, :primary?, :rules]
def new(name, opts \\ []) do
%__MODULE__{
name: name,
type: :destroy,
primary?: opts[:primary?],
rules: opts[:rules]
}
end
end

View file

@@ -0,0 +1,13 @@
defmodule Ash.Resource.Actions.Read do
defstruct [:type, :name, :primary?, :paginate?, :rules]
def new(name, opts \\ []) do
%__MODULE__{
name: name,
type: :read,
primary?: opts[:primary?],
paginate?: opts[:paginate?],
rules: opts[:rules]
}
end
end

View file

@@ -0,0 +1,12 @@
defmodule Ash.Resource.Actions.Update do
defstruct [:type, :name, :primary?, :rules]
def new(name, opts \\ []) do
%__MODULE__{
name: name,
type: :update,
primary?: opts[:primary?],
rules: opts[:rules]
}
end
end

View file

@@ -1,18 +1,66 @@
defmodule Ash.Resource.Attributes.Attribute do
defstruct [:name, :type, :ecto_type]
@doc false
def new(name, type, _opts \\ []) do
ecto_type =
if type == :uuid do
:binary_id
else
type
end
defstruct [:name, :type, :primary_key?]
%__MODULE__{
name: name,
type: type,
ecto_type: ecto_type
}
@type t :: %__MODULE__{
name: atom(),
type: Ash.type(),
primary_key?: boolean()
}
@builtins Ash.Type.builtins()
@schema Ashton.schema(opts: [primary_key?: :boolean], defaults: [primary_key?: false])
@doc false
def attribute_schema(), do: @schema
def new(resource, name, type, opts \\ [])
def new(resource, name, _, _) when not is_atom(name) do
raise Ash.Error.ResourceDslError,
resource: resource,
message: "Attribute name must be an atom, got: #{inspect(name)}",
path: [:attributes, :attribute]
end
def new(resource, _name, type, _opts) when not is_atom(type) do
raise Ash.Error.ResourceDslError,
resource: resource,
message: "Attribute type must be a built in type or a type module, got: #{inspect(type)}",
path: [:attributes, :attribute]
end
def new(resource, name, type, opts) when type in @builtins do
case Ashton.validate(opts, @schema) do
{:error, [{key, message} | _]} ->
raise Ash.Error.ResourceDslError,
resource: resource,
message: message,
path: [:attributes, :attribute],
option: key
{:ok, opts} ->
%__MODULE__{
name: name,
type: type,
primary_key?: opts[:primary_key?] || false
}
end
end
def new(resource, name, type, opts) do
if Ash.Type.ash_type?(type) do
%__MODULE__{
name: name,
type: type,
primary_key?: opts[:primary_key?] || false
}
else
raise Ash.Error.ResourceDslError,
resource: resource,
message: "Attribute type must be a built in type or a type module, got: #{inspect(type)}",
path: [:attributes, :attribute]
end
end
end

View file

@@ -7,9 +7,9 @@ defmodule Ash.Resource.Attributes do
end
end
defmacro attribute(name, type) do
quote bind_quoted: [type: type, name: name] do
@attributes Ash.Resource.Attributes.Attribute.new(name, type)
defmacro attribute(name, type, opts \\ []) do
quote bind_quoted: [type: type, name: name, opts: opts] do
@attributes Ash.Resource.Attributes.Attribute.new(__MODULE__, name, type, opts)
end
end
end
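A sketch of an `attributes` block using the builtin types and the `primary_key?` option validated above (the attribute names are illustrative). Note that `use Ash.Resource` already adds an `id` primary key unless `primary_key: false` is given:

```elixir
attributes do
  attribute :title, :string
  attribute :published_at, :utc_datetime

  # Only needed when the resource was defined with `primary_key: false`:
  # attribute :reference, :uuid, primary_key?: true
end
```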

42
lib/ash/resource/dsl.ex Normal file
View file

@@ -0,0 +1,42 @@
defmodule Ash.Resource.DSL do
@moduledoc """
The entrypoint for the Ash DSL documentation and interface.
Available DSL sections:
* `actions` - `Ash.Resource.Actions`
* `attributes` - `Ash.Resource.Attributes`
* `relationships` - `Ash.Resource.Relationships`
See the relevant module documentation. To use sections in your resource:
```elixir
defmodule MyModule do
use Ash.Resource, name: "foos", type: "foo"
actions do
...
# see actions documentation
end
attributes do
...
# see attributes documentation
end
relationships do
...
# see relationships documentation
end
end
```
"""
defmacro __using__(_) do
quote do
import Ash.Resource.Actions, only: [actions: 1]
import Ash.Resource.Attributes, only: [attributes: 1]
import Ash.Resource.Relationships, only: [relationships: 1]
end
end
end

View file

@@ -1,13 +1,43 @@
defmodule Ash.Resource.Relationships.BelongsTo do
defstruct [:name, :type, :destination, :destination_field, :source_field]
defstruct [
:name,
:cardinality,
:type,
:path,
:destination,
:primary_key?,
:side_load,
:destination_field,
:source_field
]
@type t :: %__MODULE__{
type: :belongs_to,
cardinality: :one
}
@spec new(
resource_name :: String.t(),
name :: atom,
related_resource :: Ash.resource(),
opts :: Keyword.t()
) :: t()
def new(resource_name, name, related_resource, opts \\ []) do
path = opts[:path] || resource_name <> "/:id/" <> to_string(name)
def new(name, related_resource, opts \\ []) do
%__MODULE__{
name: name,
type: :belongs_to,
cardinality: :one,
path: path,
primary_key?: Keyword.get(opts, :primary_key, false),
destination: related_resource,
destination_field: opts[:destination_field] || "id",
source_field: opts[:source_field] || "#{name}_id"
destination_field: atomize(opts[:destination_field] || "id"),
source_field: atomize(opts[:source_field] || "#{name}_id"),
side_load: opts[:side_load]
}
end
defp atomize(value) when is_atom(value), do: value
defp atomize(value) when is_bitstring(value), do: String.to_atom(value)
end
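For reference, a sketch of the defaults `new/4` fills in; `"posts"` and `Author` are hypothetical:

```elixir
Ash.Resource.Relationships.BelongsTo.new("posts", :author, Author)
#=> %Ash.Resource.Relationships.BelongsTo{
#     name: :author,
#     type: :belongs_to,
#     cardinality: :one,
#     path: "posts/:id/author",
#     primary_key?: false,
#     destination: Author,
#     destination_field: :id,
#     source_field: :author_id,
#     side_load: nil
#   }
```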

View file

@@ -0,0 +1,41 @@
defmodule Ash.Resource.Relationships.HasMany do
defstruct [
:name,
:type,
:cardinality,
:side_load,
:path,
:destination,
:destination_field,
:source_field
]
@type t :: %__MODULE__{
type: :has_many,
cardinality: :many
}
@spec new(
resource_name :: String.t(),
name :: atom,
related_resource :: Ash.resource(),
opts :: Keyword.t()
) :: t()
def new(resource_name, resource_type, name, related_resource, opts \\ []) do
path = opts[:path] || resource_name <> "/:id/" <> to_string(name)
%__MODULE__{
name: name,
type: :has_many,
cardinality: :many,
path: path,
destination: related_resource,
destination_field: atomize(opts[:destination_field] || "#{resource_type}_id"),
source_field: atomize(opts[:source_field] || "id"),
side_load: opts[:side_load]
}
end
defp atomize(value) when is_atom(value), do: value
defp atomize(value) when is_bitstring(value), do: String.to_atom(value)
end

View file

@@ -1,13 +1,53 @@
defmodule Ash.Resource.Relationships.HasOne do
defstruct [:name, :type, :destination, :destination_field, :source_field]
@doc false
defstruct [
:name,
:type,
:cardinality,
:destination,
:destination_field,
:source_field
]
@type t :: %__MODULE__{
type: :has_one,
cardinality: :one
}
@opt_schema Ashton.schema(
opts: [
destination_field: :atom,
source_field: :atom
],
defaults: [
source_field: :id
],
describe: [
destination_field:
"The field on the related resource that should match the `source_field` on this resource. Default: <resource.name>_id",
source_field:
"The field on this resource that should match the `destination_field` on the related resource."
]
)
@doc false
def opt_schema(), do: @opt_schema
@spec new(
resource_name :: String.t(),
name :: atom,
related_resource :: Ash.resource(),
opts :: Keyword.t()
) :: t()
@doc false
def new(resource_name, name, related_resource, opts \\ []) do
%__MODULE__{
name: name,
type: :has_one,
cardinality: :one,
destination: related_resource,
destination_field: opts[:destination_field] || "#{resource_name}_id",
source_field: opts[:source_field] || "id"
destination_field: opts[:destination_field] || :"#{resource_name}_id",
source_field: opts[:source_field] || :id
}
end
end

View file

@@ -0,0 +1,96 @@
defmodule Ash.Resource.Relationships.ManyToMany do
defstruct [
:name,
:type,
:through,
:cardinality,
:side_load,
:path,
:destination,
:source_field,
:destination_field,
:source_field_on_join_table,
:destination_field_on_join_table,
:join_table_fields
]
@type t :: %__MODULE__{
type: :many_to_many,
cardinality: :many
}
@spec new(
resource_name :: String.t(),
name :: atom,
related_resource :: Ash.resource(),
opts :: Keyword.t()
) :: t()
def new(resource_name, name, related_resource, opts \\ []) do
path = opts[:path] || resource_name <> "/:id/" <> to_string(name)
source_field_on_join_table =
atomize(opts[:source_field_on_join_table] || String.to_atom(resource_name <> "_id"))
destination_field_on_join_table =
opts[:destination_field_on_join_table] ||
raise """
Must set `:destination_field_on_join_table` for #{resource_name}.#{name} as it cannot currently be derived.
"""
source_field = atomize(opts[:source_field] || :id)
destination_field = atomize(opts[:destination_field] || :id)
through =
through!(
opts,
source_field_on_join_table,
destination_field_on_join_table
)
%__MODULE__{
name: name,
type: :many_to_many,
cardinality: :many,
path: path,
through: through,
side_load: opts[:side_load],
destination: related_resource,
source_field: source_field,
destination_field: destination_field,
source_field_on_join_table: source_field_on_join_table,
destination_field_on_join_table: destination_field_on_join_table,
join_table_fields: opts[:join_table_fields] || []
}
end
defp atomize(value) when is_atom(value), do: value
defp atomize(value) when is_bitstring(value), do: String.to_atom(value)
defp through!(opts, _source_field_on_join_table, _destination_field_on_join_table) do
case opts[:through] do
through when is_atom(through) ->
through
# TODO: do this check at runtime. When done at compilation, it forces the modules
# to be compiled, which causes warnings in ecto.
# case Ash.primary_key(through) do
# [^source_field_on_join_table, ^destination_field_on_join_table] ->
# through
# [^destination_field_on_join_table, ^source_field_on_join_table] ->
# through
# other ->
# raise "The primary key of a join table must be the same as the fields that are used for joining. Needed: #{
# inspect([destination_field_on_join_table, source_field_on_join_table])
# } got #{other}"
# end
through when is_bitstring(through) ->
through
_ ->
raise "`:through` option must be a string representing a join table or a module representinga resource"
end
end
end
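A sketch of the required options, based on the constructor above (the resource, module, and table names are hypothetical):

```elixir
Ash.Resource.Relationships.ManyToMany.new("posts", :tags, Tag,
  through: "posts_tags",
  destination_field_on_join_table: :tag_id
)
# source_field_on_join_table defaults to :posts_id (the resource name plus "_id"),
# while source_field and destination_field both default to :id.
# :destination_field_on_join_table has no default and must be given explicitly.
```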

View file

@@ -1,4 +1,19 @@
defmodule Ash.Resource.Relationships do
@moduledoc """
DSL components for declaring relationships.
Relationships are a core component of resource-oriented design. Many components of Ash
make use of these relationships. A simple use case is side loading (done via the `side_load`
option given to an API action). A more complex use case might be building authorization
rules that grant access to a resource based on how the user is related to it.
Available configurations:
`has_one/3`
`belongs_to/3`
`has_many/3`
`many_to_many/3`
"""
defmacro relationships(do: block) do
quote do
import Ash.Resource.Relationships
@@ -7,54 +22,86 @@
end
end
defmacro has_one(relationship_name, resource, config \\ []) do
alias Ash.Resource.Relationships.HasOne
@doc """
Declares a has_one relationship. In a relational database, the foreign key would be on the *other* table.
Generally speaking, a `has_one` also implies that the destination table is unique on that foreign key.
Example:
```elixir
# In a resource called `Word`
has_one :dictionary_entry, DictionaryEntry,
source_field: :text,
destination_field: :word_text
```
#{Ashton.document(HasOne.opt_schema())}
"""
defmacro has_one(relationship_name, resource, opts \\ []) do
quote do
@relationships Ash.Resource.Relationships.HasOne.new(
@name,
unquote(relationship_name),
unquote(resource),
unquote(config)
)
relationship =
HasOne.new(
@name,
unquote(relationship_name),
unquote(resource),
unquote(opts)
)
@relationships relationship
end
end
defmacro belongs_to(relationship_name, resource, config \\ []) do
quote do
@relationships Ash.Resource.Relationships.BelongsTo.new(
unquote(relationship_name),
unquote(resource),
unquote(config)
)
relationship =
Ash.Resource.Relationships.BelongsTo.new(
@name,
unquote(relationship_name),
unquote(resource),
unquote(config)
)
# TODO: This assumes binary_id
@attributes Ash.Resource.Attributes.Attribute.new(
__MODULE__,
relationship.source_field,
:uuid,
primary_key?: relationship.primary_key?
)
@relationships relationship
end
end
# defmacro has_many(name, resource, config \\ []) do
# quote do
# @relationships Keyword.put(@relationships, unquote(name),
# type: :has_many,
# resource: unquote(resource),
# config: unquote(config)
# )
# end
# end
defmacro has_many(relationship_name, resource, config \\ []) do
quote do
relationship =
Ash.Resource.Relationships.HasMany.new(
@name,
@resource_type,
unquote(relationship_name),
unquote(resource),
unquote(config)
)
# defmacro belongs_to(name, resource, config \\ []) do
# quote do
# @relationships Keyword.put(@relationships, unquote(name),
# type: :belongs_to,
# resource: unquote(resource),
# config: unquote(config)
# )
# end
# end
@relationships relationship
end
end
# defmacro many_to_many(name, resource, config \\ []) do
# quote do
# @relationships Keyword.put(@relationships, unquote(name),
# type: :many_to_many,
# resource: unquote(resource),
# config: unquote(config)
# )
# end
# end
defmacro many_to_many(relationship_name, resource, config \\ []) do
quote do
relationship =
Ash.Resource.Relationships.ManyToMany.new(
@name,
unquote(relationship_name),
unquote(resource),
unquote(config)
)
@relationships relationship
end
end
end
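Putting the macros together, a sketch of a `relationships` block (the related resources are hypothetical). Note that `belongs_to` also registers a `:uuid` attribute for its source field, so the foreign key does not need to be declared separately:

```elixir
relationships do
  belongs_to :author, MyApp.Author
  has_many :comments, MyApp.Comment

  many_to_many :tags, MyApp.Tag,
    through: "posts_tags",
    destination_field_on_join_table: :tag_id
end
```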

View file

@@ -1,4 +1,4 @@
defmodule Ash.Resource.Schema do
defmodule Ash.Schema do
defmacro define_schema(name) do
quote do
use Ecto.Schema
@@ -8,12 +8,40 @@ defmodule Ash.Resource.Schema do
schema unquote(name) do
for attribute <- @attributes do
unless attribute.name == :id do
field attribute.name, attribute.ecto_type
field(attribute.name, Ash.Type.ecto_type(attribute.type))
end
end
for relationship <- Enum.filter(@relationships, &(&1.type == :belongs_to)) do
belongs_to relationship.name, relationship.destination
belongs_to(relationship.name, relationship.destination,
define_field: false,
foreign_key: relationship.source_field,
references: relationship.destination_field
)
end
for relationship <- Enum.filter(@relationships, &(&1.type == :has_one)) do
has_one(relationship.name, relationship.destination,
foreign_key: relationship.destination_field,
references: relationship.source_field
)
end
for relationship <- Enum.filter(@relationships, &(&1.type == :has_many)) do
has_many(relationship.name, relationship.destination,
foreign_key: relationship.destination_field,
references: relationship.source_field
)
end
for relationship <- Enum.filter(@relationships, &(&1.type == :many_to_many)) do
many_to_many(relationship.name, relationship.destination,
join_through: relationship.through,
join_keys: [
{relationship.source_field_on_join_table, relationship.source_field},
{relationship.destination_field_on_join_table, relationship.destination_field}
]
)
end
end
end
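For orientation, for a hypothetical `posts` resource with a `:title` attribute and a `belongs_to :author` relationship, the macro above generates roughly the following Ecto schema (a sketch; handling of the primary key itself falls outside the hunk shown here):

```elixir
schema "posts" do
  field(:title, :string)
  # the foreign key that `belongs_to` registered as an attribute in the DSL:
  field(:author_id, :binary_id)

  belongs_to(:author, MyApp.Author,
    define_field: false,
    foreign_key: :author_id,
    references: :id
  )
end
```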

View file

@@ -1,9 +0,0 @@
defmodule Ash.Routes do
def get(resource, id) do
index(resource) <> "/" <> to_string(id)
end
def index(resource) do
"/" <> Ash.name(resource)
end
end

202
lib/ash/type/type.ex Normal file
View file

@@ -0,0 +1,202 @@
defmodule Ash.Type do
@moduledoc """
This behaviour is a superset of the `Ecto.Type` behaviour that also carries
API-level information, such as which kinds of filters are allowed. Eventually,
it may be used for composite types or serialization.
It is much better to `use Ash.Type` than to declare `@behaviour Ash.Type` and
define everything yourself.
"""
@callback supported_filter_types(Ash.data_layer()) :: list(Ash.DataLayer.Filter.filter_type())
@callback sortable?(Ash.data_layer()) :: boolean
@callback storage_type() :: Ecto.Type.t()
@callback ecto_type() :: Ecto.Type.t()
@callback cast_input(term) :: {:ok, term} | {:error, keyword()} | :error
@callback cast_stored(term) :: {:ok, term} | :error
@callback dump_to_native(term) :: {:ok, term} | :error
@callback equal?(term, term) :: boolean
@callback describe() :: String.t()
@builtins [
string: [ecto_type: :string, filters: [:equal], sortable?: true],
uuid: [ecto_type: :binary_id, filters: [:equal], sortable?: true],
utc_datetime: [ecto_type: :utc_datetime, filters: [:equal], sortable?: true]
]
@builtin_names Keyword.keys(@builtins)
@type t :: module | atom
@doc """
Returns a list of filter types supported by this type. By default, a type supports only the `:equal` filter.
"""
@spec supported_filter_types(t, Ash.data_layer()) ::
list(Ash.DataLayer.Filter.filter_type())
def supported_filter_types(type, _data_layer) when type in @builtin_names do
@builtins[type][:filters]
end
def supported_filter_types(type, data_layer), do: type.supported_filter_types(data_layer)
@doc """
Determines whether or not values of this type can be sorted.
"""
@spec sortable?(t, Ash.data_layer()) :: boolean
def sortable?(type, _data_layer) when type in @builtin_names do
@builtins[type][:sortable?]
end
def sortable?(type, data_layer), do: type.sortable?(data_layer)
@doc """
Returns the *underlying* storage type (the underlying type of the *ecto type* of the *ash type*)
"""
@spec storage_type(t()) :: Ecto.Type.t()
def storage_type(type), do: type.storage_type()
@doc """
Returns the ecto compatible type for an Ash.Type.
If you `use Ash.Type`, this is created for you. For builtin types
this may return a corresponding ecto builtin type (atom).
"""
@spec ecto_type(t) :: Ecto.Type.t()
for {name, builtin} <- @builtins do
def ecto_type(unquote(name)), do: unquote(builtin[:ecto_type])
end
def ecto_type(type) do
type.ecto_type()
end
@doc """
Casts input (e.g. unknown) data to an instance of the type, or errors
Maps to `Ecto.Type.cast/2`
"""
@spec cast_input(t(), term) :: {:ok, term} | {:error, keyword()} | :error
def cast_input(type, term) when type in @builtin_names do
Ecto.Type.cast(@builtins[type][:ecto_type], term)
end
def cast_input(type, term) do
type.cast_input(term)
end
@doc """
Casts a value from the data store to an instance of the type, or errors
Maps to `Ecto.Type.load/2`
"""
@spec cast_stored(t(), term) :: {:ok, term} | {:error, keyword()} | :error
def cast_stored(type, term) when type in @builtin_names do
Ecto.Type.load(@builtins[type][:ecto_type], term)
end
def cast_stored(type, term) do
type.cast_stored(term)
end
@doc """
Casts a value from the Elixir type to a value that the data store can persist
Maps to `Ecto.Type.dump/2`
"""
@spec dump_to_native(t(), term) :: {:ok, term} | {:error, keyword()} | :error
def dump_to_native(type, term) when type in @builtin_names do
Ecto.Type.dump(@builtins[type][:ecto_type], term)
end
def dump_to_native(type, term) do
type.dump_to_native(term)
end
@doc """
Determines if two values of a given type are equal.
Maps to `Ecto.Type.equal?/3`
"""
@spec equal?(t(), term, term) :: boolean
def equal?(type, left, right) when type in @builtin_names do
Ecto.Type.equal?(@builtins[type][:ecto_type], left, right)
end
def equal?(type, left, right) do
type.equal?(left, right)
end
# @callback equal?(term, term) :: boolean
defmacro __using__(_) do
quote do
@behaviour Ash.Type
parent = __MODULE__
defmodule EctoType do
@behaviour Ecto.Type
@parent parent
@impl true
def type do
@parent.storage_type()
end
@impl true
def cast(term) do
@parent.cast_input(term)
end
@impl true
def load(term) do
@parent.cast_stored(term)
end
@impl true
def dump(term) do
@parent.dump_to_native(term)
end
@impl true
def equal?(left, right) do
@parent.equal?(left, right)
end
@impl true
def embed_as(_), do: :self
end
@impl true
def ecto_type(), do: EctoType
@impl true
def supported_filter_types(_data_layer), do: [:equal]
@impl true
def sortable?(_data_layer), do: true
@impl true
def equal?(left, right), do: left == right
defoverridable supported_filter_types: 1, equal?: 2, sortable?: 1
end
end
@doc "A list of the built in type names"
def builtins(), do: @builtin_names
@doc "Returns true if the value is a builtin type or adopts the `Ash.Type` behaviour"
def ash_type?(atom) when atom in @builtin_names, do: true
def ash_type?(module) do
:erlang.function_exported(module, :module_info, 0) and ash_type_module?(module)
end
defp ash_type_module?(module) do
:attributes
|> module.module_info()
|> Keyword.get(:behaviour, [])
|> Enum.any?(&(&1 == __MODULE__))
end
end
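A few illustrative calls against the builtins above (`SomeDataLayer` stands in for any data layer module and is ignored for builtin types):

```elixir
Ash.Type.builtins()
#=> [:string, :uuid, :utc_datetime]

Ash.Type.ecto_type(:uuid)
#=> :binary_id

Ash.Type.supported_filter_types(:string, SomeDataLayer)
#=> [:equal]

Ash.Type.ash_type?(:string)
#=> true
```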

42
mix.exs
View file

@@ -1,31 +1,55 @@
defmodule Ash.MixProject do
use Mix.Project
@description """
A resource declaration and interaction library. Built with pluggable data layers, and
designed to be used by multiple front ends.
"""
def project do
[
app: :ash,
version: "0.1.0",
elixir: "~> 1.9",
start_permanent: Mix.env() == :prod,
deps: deps()
elixirc_paths: elixirc_paths(Mix.env()),
package: package(),
deps: deps(),
docs: docs(),
description: @description,
source_url: "https://github.com/ash-project/ash",
homepage_url: "https://github.com/ash-project/ash"
]
end
# Run "mix help compile.app" to learn about applications.
def application do
defp docs() do
# The main page in the docs
[main: "readme", extras: ["README.md"]]
end
defp package do
[
extra_applications: [:logger],
mod: {Ash.Application, []}
name: :ash,
licenses: ["MIT"],
links: %{
GitHub: "https://github.com/ash-project/ash"
}
]
end
defp elixirc_paths(:test) do
["lib", "test/support"]
end
defp elixirc_paths(_), do: ["lib"]
# Run "mix help deps" to learn about dependencies.
defp deps do
[
{:ecto_sql, "~> 3.0"},
{:postgrex, ">= 0.0.0"},
{:plug, "~> 1.8"},
{:jason, "~> 1.1"}
{:ecto, "~> 3.0"},
{:ets, github: "zachdaniel/ets", ref: "b96da05e75926e340e8a0fdfea9c095d97ed8d50"},
{:ex_doc, "~> 0.21", only: :dev, runtime: false},
{:ashton, "~> 0.3.9"}
]
end
end

View file

@@ -1,11 +1,18 @@
%{
"ashton": {:hex, :ashton, "0.3.9", "1c089d62d35a17c1f31db4e9130fb90f8d802c8c9078fd29138be7b6b93305b5", [:mix], [], "hexpm"},
"connection": {:hex, :connection, "1.0.4", "a1cae72211f0eef17705aaededacac3eb30e6625b04a6117c1b2db6ace7d5976", [:mix], [], "hexpm"},
"dataloader": {:hex, :dataloader, "1.0.6", "fb724d6d3fb6acb87d27e3b32dea3a307936ad2d245faf9cf5221d1323d6a4ba", [:mix], [{:ecto, ">= 0.0.0", [hex: :ecto, repo: "hexpm", optional: true]}], "hexpm"},
"db_connection": {:hex, :db_connection, "2.1.1", "a51e8a2ee54ef2ae6ec41a668c85787ed40cb8944928c191280fe34c15b76ae5", [:mix], [{:connection, "~> 1.0.2", [hex: :connection, repo: "hexpm", optional: false]}], "hexpm"},
"decimal": {:hex, :decimal, "1.8.0", "ca462e0d885f09a1c5a342dbd7c1dcf27ea63548c65a65e67334f4b61803822e", [:mix], [], "hexpm"},
"ecto": {:hex, :ecto, "3.2.2", "bb6d1dbcd7ef975b60637e63182e56f3d7d0b5dd9c46d4b9d6183a5c455d65d1", [:mix], [{:decimal, "~> 1.6", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm"},
"ecto_sql": {:hex, :ecto_sql, "3.2.0", "751cea597e8deb616084894dd75cbabfdbe7255ff01e8c058ca13f0353a3921b", [:mix], [{:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:ecto, "~> 3.2.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:myxql, "~> 0.2.0", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.15.0", [hex: :postgrex, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"},
"earmark": {:hex, :earmark, "1.4.3", "364ca2e9710f6bff494117dbbd53880d84bebb692dafc3a78eb50aa3183f2bfd", [:mix], [], "hexpm"},
"ecto": {:hex, :ecto, "3.2.5", "76c864b77948a479e18e69cc1d0f0f4ee7cced1148ffe6a093ff91eba644f0b5", [:mix], [{:decimal, "~> 1.6", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm"},
"ets": {:git, "https://github.com/zachdaniel/ets.git", "b96da05e75926e340e8a0fdfea9c095d97ed8d50", [ref: "b96da05e75926e340e8a0fdfea9c095d97ed8d50"]},
"ex_doc": {:hex, :ex_doc, "0.21.2", "caca5bc28ed7b3bdc0b662f8afe2bee1eedb5c3cf7b322feeeb7c6ebbde089d6", [:mix], [{:earmark, "~> 1.3.3 or ~> 1.4", [hex: :earmark, repo: "hexpm", optional: false]}, {:makeup_elixir, "~> 0.14", [hex: :makeup_elixir, repo: "hexpm", optional: false]}], "hexpm"},
"jason": {:hex, :jason, "1.1.2", "b03dedea67a99223a2eaf9f1264ce37154564de899fd3d8b9a21b1a6fd64afe7", [:mix], [{:decimal, "~> 1.0", [hex: :decimal, repo: "hexpm", optional: true]}], "hexpm"},
"makeup": {:hex, :makeup, "1.0.0", "671df94cf5a594b739ce03b0d0316aa64312cee2574b6a44becb83cd90fb05dc", [:mix], [{:nimble_parsec, "~> 0.5.0", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm"},
"makeup_elixir": {:hex, :makeup_elixir, "0.14.0", "cf8b7c66ad1cff4c14679698d532f0b5d45a3968ffbcbfd590339cb57742f1ae", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}], "hexpm"},
"mime": {:hex, :mime, "1.3.1", "30ce04ab3175b6ad0bdce0035cba77bba68b813d523d1aac73d9781b4d193cf8", [:mix], [], "hexpm"},
"nimble_parsec": {:hex, :nimble_parsec, "0.5.2", "1d71150d5293d703a9c38d4329da57d3935faed2031d64bc19e77b654ef2d177", [:mix], [], "hexpm"},
"plug": {:hex, :plug, "1.8.3", "12d5f9796dc72e8ac9614e94bda5e51c4c028d0d428e9297650d09e15a684478", [:mix], [{:mime, "~> 1.0", [hex: :mime, repo: "hexpm", optional: false]}, {:plug_crypto, "~> 1.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: true]}], "hexpm"},
"plug_crypto": {:hex, :plug_crypto, "1.0.0", "18e49317d3fa343f24620ed22795ec29d4a5e602d52d1513ccea0b07d8ea7d4d", [:mix], [], "hexpm"},
"postgrex": {:hex, :postgrex, "0.15.1", "23ce3417de70f4c0e9e7419ad85bdabcc6860a6925fe2c6f3b1b5b1e8e47bf2f", [:mix], [{:connection, "~> 1.0", [hex: :connection, repo: "hexpm", optional: false]}, {:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:decimal, "~> 1.5", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm"},

123
test/actions/read_test.exs Normal file
View file

@@ -0,0 +1,123 @@
defmodule Ash.Test.Actions.ReadTest do
use ExUnit.Case, async: true
# TODO: test the bang versions of read actions
defmodule Post do
use Ash.Resource, name: "posts", type: "post"
use Ash.DataLayer.Ets, private?: true
actions do
defaults [:read, :create]
end
attributes do
attribute :title, :string
attribute :contents, :string
end
end
defmodule Api do
use Ash.Api
resources [Post]
end
describe "api.get/3" do
setup do
{:ok, post} = Api.create(Post, %{attributes: %{title: "test", contents: "yeet"}})
%{post: post}
end
test "it returns a matching record", %{post: post} do
assert {:ok, fetched_post} = Api.get(Post, post.id)
assert fetched_post == post
end
test "it returns nil when there is no matching record" do
assert {:ok, nil} = Api.get(Post, Ecto.UUID.generate())
end
end
describe "Ash.read/2 with no records" do
test "returns an empty result" do
assert {:ok, %{results: []}} = Api.read(Post)
end
end
describe "Ash.read/2" do
setup do
{:ok, post1} = Api.create(Post, %{attributes: %{title: "test", contents: "yeet"}})
{:ok, post2} = Api.create(Post, %{attributes: %{title: "test1", contents: "yeet2"}})
%{post1: post1, post2: post2}
end
test "with page size of 1, returns only 1 record" do
assert {:ok, %{results: [_post]}} = Api.read(Post, %{page: %{limit: 1}})
end
test "with page size of 2, returns 2 records" do
assert {:ok, %{results: [_, _]}} = Api.read(Post, %{page: %{limit: 2}})
end
test "with page size of 1 and an offset of 1, it returns 1 record" do
assert {:ok, %{results: [_]}} = Api.read(Post, %{page: %{limit: 1, offset: 1}})
end
end
describe "filters" do
setup do
{:ok, post1} = Api.create(Post, %{attributes: %{title: "test", contents: "yeet"}})
{:ok, post2} = Api.create(Post, %{attributes: %{title: "test1", contents: "yeet"}})
%{post1: post1, post2: post2}
end
test "a filter that matches nothing returns no results" do
assert {:ok, %{results: []}} = Api.read(Post, %{filter: %{contents: "not_yeet"}})
end
test "a filter returns only matching records", %{post1: post1} do
assert {:ok, %{results: [^post1]}} = Api.read(Post, %{filter: %{title: post1.title}})
end
test "a filter returns multiple records if they match", %{post1: post1, post2: post2} do
assert {:ok, %{results: [_, _] = results}} = Api.read(Post, %{filter: %{contents: "yeet"}})
assert post1 in results
assert post2 in results
end
end
describe "sort" do
setup do
{:ok, post1} = Api.create(Post, %{attributes: %{title: "abc", contents: "abc"}})
{:ok, post2} = Api.create(Post, %{attributes: %{title: "xyz", contents: "abc"}})
%{post1: post1, post2: post2}
end
test "a sort will sort the rows accordingly when ascending", %{
post1: post1,
post2: post2
} do
assert {:ok, %{results: [^post1, ^post2]}} = Api.read(Post, %{sort: [asc: :title]})
end
test "a sort will sor rows accordingly when descending", %{
post1: post1,
post2: post2
} do
assert {:ok, %{results: [^post2, ^post1]}} = Api.read(Post, %{sort: [desc: :title]})
end
test "a nested sort sorts accordingly", %{post1: post1, post2: post2} do
{:ok, middle_post} = Api.create(Post, %{attributes: %{title: "abc", contents: "xyz"}})
assert {:ok, %{results: [^post1, ^middle_post, ^post2]}} =
Api.read(Post, %{sort: [asc: :title, asc: :contents]})
end
end
end

View file

@@ -1,8 +1,4 @@
defmodule AshTest do
use ExUnit.Case
doctest Ash
test "greets the world" do
assert Ash.hello() == :world
end
end

View file

@@ -0,0 +1,57 @@
defmodule Ash.Test.Dsl.Resource.AttributesTest do
use ExUnit.Case, async: true
defmacrop defposts(do: body) do
quote do
defmodule Post do
use Ash.Resource, name: "posts", type: "post"
unquote(body)
end
end
end
describe "validation" do
test "raises if the attribute name is not an atom" do
assert_raise(
Ash.Error.ResourceDslError,
"Ash.Test.Dsl.Resource.AttributesTest.Post: Attribute name must be an atom, got: 10 at attributes->attribute",
fn ->
defposts do
attributes do
attribute 10, :string
end
end
end
)
end
test "raises if the type is not a known type" do
assert_raise(
Ash.Error.ResourceDslError,
"Ash.Test.Dsl.Resource.AttributesTest.Post: Attribute type must be a built in type or a type module, got: 10 at attributes->attribute",
fn ->
defposts do
attributes do
attribute :foo, 10
end
end
end
)
end
test "raises if you pass an invalid value for `primary_key?`" do
assert_raise(
Ash.Error.ResourceDslError,
"Ash.Test.Dsl.Resource.AttributesTest.Post: option primary_key? at attributes->attribute must be of type :boolean",
fn ->
defposts do
attributes do
attribute :foo, :string, primary_key?: 10
end
end
end
)
end
end
end

View file

0
test/support/.gitkeep Normal file
View file

83
test/type/type_test.exs Normal file
View file

@@ -0,0 +1,83 @@
defmodule Ash.Test.Type.TypeTest do
use ExUnit.Case, async: true
defmodule PostTitle do
use Ash.Type
def describe() do
"A post title is less than 10 characters long and is only alphabetic characters and whitespace"
end
def storage_type(), do: :string
def cast_input(value) when is_bitstring(value) do
if String.length(value) <= 10 && String.match?(value, ~r/[a-zA-Z\w]*/) do
{:ok, value}
else
:error
end
end
def cast_input(_), do: :error
def supported_filter_types(_data_layer), do: []
def sortable?(_data_layer), do: false
def cast_stored(value) when is_bitstring(value), do: value
def cast_stored(_), do: :error
def dump_to_native(value) when is_bitstring(value), do: value
def dump_to_native(_), do: :error
end
defmodule Post do
use Ash.Resource, name: "posts", type: "post"
use Ash.DataLayer.Ets, private?: true
attributes do
attribute :title, PostTitle
end
actions do
defaults [:create, :read]
end
end
defmodule Api do
use Ash.Api
resources [Post]
end
test "it accepts valid data" do
post = Api.create!(Post, %{attributes: %{title: "foobar"}})
assert post.title == "foobar"
end
test "it rejects invalid data" do
# As we add informative errors, this test will fail and we will know to test those
# more informative errors.
assert_raise(Ash.Error.FrameworkError, "invalid attributes", fn ->
Api.create!(Post, %{attributes: %{title: "foobarbazbuzbiz"}})
end)
end
test "it rejects filtering on the field if the filter type is not supported" do
# As we add more filter types, we may want to test their multiplicity here
post = Api.create!(Post, %{attributes: %{title: "foobar"}})
assert_raise(Ash.Error.FrameworkError, "Cannot filter :title for equality.", fn ->
Api.read!(Post, %{filter: %{title: post.title}})
end)
end
test "it rejects sorting on the field if sorting is not supported" do
Api.create!(Post, %{attributes: %{title: "foobar1"}})
Api.create!(Post, %{attributes: %{title: "foobar2"}})
assert_raise(Ash.Error.FrameworkError, "Cannot sort on :title", fn ->
Api.read!(Post, %{sort: [asc: :title]})
end)
end
end