initial commit

This commit is contained in:
Zach Daniel 2022-10-28 10:21:13 -05:00
parent c68a8fa9e9
commit 31e246d305
19 changed files with 907 additions and 211 deletions

View file

@ -1,4 +1,11 @@
# Used by "mix format"
spark_locals_without_parens = []
[
inputs: ["{mix,.formatter}.exs", "{config,lib,test}/**/*.{ex,exs}"]
inputs: ["{mix,.formatter}.exs", "{config,lib,test}/**/*.{ex,exs}"],
import_deps: [:ash, :spark],
locals_without_parens: spark_locals_without_parens,
export: [
locals_without_parens: spark_locals_without_parens
]
]

76
.github/CODE_OF_CONDUCT.md vendored Normal file
View file

@ -0,0 +1,76 @@
# Contributor Covenant Code of Conduct
## Our Pledge
In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, sex characteristics, gender identity and expression,
level of experience, education, socio-economic status, nationality, personal
appearance, race, religion, or sexual identity and orientation.
## Our Standards
Examples of behavior that contributes to creating a positive environment
include:
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery and unwelcome sexual attention or
advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Our Responsibilities
Project maintainers are responsible for clarifying the standards of acceptable
behavior and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behavior.
Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviors that they deem inappropriate,
threatening, offensive, or harmful.
## Scope
This Code of Conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community. Examples of
representing a project or community include using an official project e-mail
address, posting via an official social media account, or acting as an appointed
representative at an online or offline event. Representation of a project may be
further defined and clarified by project maintainers.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the project team at zach@zachdaniel.dev. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
Further details of specific enforcement policies may be posted separately.
Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html
[homepage]: https://www.contributor-covenant.org
For answers to common questions about this code of conduct, see
https://www.contributor-covenant.org/faq

10
.github/CONTRIBUTING.md vendored Normal file
View file

@ -0,0 +1,10 @@
# Contributing to Ash
* We have a zero tolerance policy for failure to abide by our code of conduct. It is very standard, but please make sure
you have read it.
* Issues may be opened to propose new ideas, to ask questions, or to file bugs.
* Before working on a feature, please talk to the core team/the rest of the community via a proposal. We are
building something that needs to be cohesive and well thought out across all use cases. Our top priority is
supporting real life use cases like yours, but we have to make sure that we do that in a sustainable way. The
best compromise there is to make sure that discussions are centered around the *use case* for a feature, rather
than the proposed feature itself.

27
.github/ISSUE_TEMPLATE/bug_report.md vendored Normal file
View file

@ -0,0 +1,27 @@
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: bug, needs review
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
A minimal set of resource definitions and calls that can reproduce the bug.
**Expected behavior**
A clear and concise description of what you expected to happen.
**Runtime**
- Elixir version
- Erlang version
- OS
- Ash version
- any related extension versions
**Additional context**
Add any other context about the problem here.

36
.github/ISSUE_TEMPLATE/proposal.md vendored Normal file
View file

@ -0,0 +1,36 @@
---
name: Proposal
about: Suggest an idea for this project
title: ''
labels: enhancement, needs review
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Express the feature either with a change to resource syntax, or with a change to the resource interface**
For example
```elixir
attributes do
attribute :foo, :integer, bar: 10 # <- Adding `bar` here would cause <x>
end
```
Or
```elixir
Api.read(:resource, bar: 10) # <- Adding `bar` here would cause <x>
```
**Additional context**
Add any other context or screenshots about the feature request here.

4
.github/PULL_REQUEST_TEMPLATE.md vendored Normal file
View file

@ -0,0 +1,4 @@
### Contributor checklist
- [ ] Bug fixes include regression tests
- [ ] Features include unit/acceptance tests

94
.github/workflows/elixir.yml vendored Normal file
View file

@ -0,0 +1,94 @@
name: Ash CI
on:
push:
branches: [main]
tags-ignore: ["v*"]
pull_request:
branches: [main]
create:
tags: ["v*"]
branches: main
jobs:
build:
runs-on: ubuntu-latest
name: OTP ${{matrix.otp}} / Elixir ${{matrix.elixir}} / Ash ${{matrix.ash}}
strategy:
fail-fast: false
matrix:
otp: ["23"]
elixir: ["1.14.0"]
ash: ["main", "2.2.0"]
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
ASH_VERSION: ${{matrix.ash}}
steps:
- run: sudo apt-get install --yes erlang-dev
- uses: actions/checkout@v2
- uses: erlef/setup-elixir@v1
with:
otp-version: ${{matrix.otp}}
elixir-version: ${{matrix.elixir}}
- uses: actions/cache@v1
id: cache-deps
with:
path: deps
key: otp-${{matrix.otp}}-elixir-${{matrix.elixir}}-deps-2-${{ hashFiles(format('{0}{1}', github.workspace, '/mix.lock')) }}
restore-keys: otp-${{matrix.otp}}-elixir-${{matrix.elixir}}-deps-2-
- uses: actions/cache@v1
id: cache-build
with:
path: _build
key: otp-${{matrix.otp}}-elixir-${{matrix.elixir}}-build-2-${{ hashFiles(format('{0}{1}', github.workspace, '/mix.lock')) }}
restore-keys: otp-${{matrix.otp}}-elixir-${{matrix.elixir}}-build-2-
- run: mix deps.get
- run: mix check --except dialyzer
if: startsWith(github.ref, 'refs/tags/v')
- run: mix check
if: "!startsWith(github.ref, 'refs/tags/v')"
release:
needs: [build]
if: startsWith(github.ref, 'refs/tags/v')
runs-on: ubuntu-latest
name: Release
strategy:
matrix:
otp: ["23"]
elixir: ["1.13.2"]
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
steps:
- run: sudo apt-get install --yes erlang-dev
- uses: actions/checkout@v2
- uses: erlef/setup-elixir@v1
with:
otp-version: ${{matrix.otp}}
elixir-version: ${{matrix.elixir}}
- uses: actions/cache@v1
id: cache-deps
with:
path: deps
key: otp-${{matrix.otp}}-elixir-${{matrix.elixir}}-deps-2-${{ hashFiles(format('{0}{1}', github.workspace, '/mix.lock')) }}
restore-keys: otp-${{matrix.otp}}-elixir-${{matrix.elixir}}-deps-2-
- run: mix deps.get
- run: mix compile
- run: mix hex.publish --yes
if: startsWith(github.ref, 'refs/tags/v')
env:
HEX_API_KEY: ${{ secrets.HEX_API_KEY }}
- uses: little-core-labs/get-git-tag@v3.0.1
id: tagName
- uses: ethomson/send-tweet-action@v1
if: startsWith(github.ref, 'refs/tags/v')
with:
status: |
AshBlog version "${{ steps.tagName.outputs.tag }}" released!
#myelixirstatus
See the changelog for more info:
https://github.com/ash-project/ash_blog/blob/main/CHANGELOG.md
consumer-key: ${{ secrets.TWITTER_CONSUMER_API_KEY }}
consumer-secret: ${{ secrets.TWITTER_CONSUMER_API_SECRET }}
access-token: ${{ secrets.TWITTER_ACCESS_TOKEN }}
access-token-secret: ${{ secrets.TWITTER_ACCESS_TOKEN_SECRET }}

1
FUNDING.yml Normal file
View file

@ -0,0 +1 @@
github: zachdaniel

21
LICENSE Normal file
View file

@ -0,0 +1,21 @@
MIT License
Copyright (c) 2020 Zachary Scott Daniel
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

View file

@ -2,17 +2,4 @@ defmodule AshBlog do
@moduledoc """
Documentation for `AshBlog`.
"""
@doc """
Hello world.
## Examples
iex> AshBlog.hello()
:world
"""
def hello do
:world
end
end

View file

@ -14,12 +14,46 @@ defmodule AshBlog.DataLayer do
],
links: [],
schema: [
file_namer: [
type: :mfa,
default: {AshBlog.FileNamer, :name_file, []},
doc: """
An MFA that will take a changeset and produce a file name.
The default one looks for a title or name, and appends it to `YYYY/YYYY-MM-DD-\#\{dasherized_name\}.md`.
The date uses the time at which the file name was generated.
"""
],
title_attribute: [
type: :atom,
default: :title,
doc:
"The attribute name to use for the title of the blog post. Will be created if it doesn't exist."
],
created_at_attribute: [
type: :atom,
default: :created_at,
doc:
"The attribute name to use for the created_at timestamp of the blog post. Will be created if it doesn't exist."
],
body_attribute: [
type: :atom,
default: :body,
doc:
"The attribute name to use for the body of the post. Wil be created if it doesn't exist."
],
folder: [
type: :string,
default: "static/blog",
doc: """
A path relative to the priv directory where the files should be placed.
"""
],
staging_folder: [
type: :string,
default: "static/blog",
doc: """
A path relative to the priv directory where the files should be placed when they are staged.
"""
]
]
}
@ -43,7 +77,7 @@ defmodule AshBlog.DataLayer do
use Spark.Dsl.Extension,
sections: [@blog],
transformers: [Ash.DataLayer.Transformers.RequirePreCheckWith]
transformers: [AshBlog.DataLayer.Transformers.AddStructure]
alias Ash.Actions.Sort
@ -54,10 +88,8 @@ defmodule AshBlog.DataLayer do
:filter,
:limit,
:sort,
:tenant,
:api,
calculations: [],
aggregates: [],
relationships: %{},
offset: 0
]
@ -70,17 +102,12 @@ defmodule AshBlog.DataLayer do
def can?(_, :composite_primary_key), do: true
def can?(_, :expression_calculation), do: true
def can?(_, :expression_calculation_sort), do: true
def can?(_, :multitenancy), do: true
def can?(_, :upsert), do: true
def can?(_, :aggregate_filter), do: true
def can?(_, :aggregate_sort), do: true
def can?(_, {:aggregate_relationship, _}), do: true
def can?(_, {:filter_relationship, _}), do: true
def can?(_, {:aggregate, :count}), do: true
def can?(_, :create), do: true
def can?(_, :read), do: true
def can?(_, :update), do: true
def can?(_, :destroy), do: true
# Destroy is not implemented yet, because I didn't need it
def can?(_, :destroy), do: false
def can?(_, :sort), do: true
def can?(_, :filter), do: true
def can?(_, :limit), do: true
@ -120,12 +147,6 @@ defmodule AshBlog.DataLayer do
def add_aggregate(query, aggregate, _),
do: {:ok, %{query | aggregates: [aggregate | query.aggregates]}}
@doc false
@impl true
def set_tenant(_resource, query, tenant) do
{:ok, %{query | tenant: tenant}}
end
@doc false
@impl true
def filter(query, filter, _resource) do
@ -184,15 +205,12 @@ defmodule AshBlog.DataLayer do
offset: offset,
limit: limit,
sort: sort,
tenant: tenant,
calculations: calculations,
aggregates: aggregates,
api: api
},
_resource
) do
with {:ok, records} <- get_records(resource, tenant),
{:ok, records} <- do_add_aggregates(records, api, resource, aggregates),
with {:ok, records} <- get_records(resource),
{:ok, records} <-
filter_matches(records, filter, api),
{:ok, records} <-
@ -269,61 +287,57 @@ defmodule AshBlog.DataLayer do
end
end
defp do_add_aggregates(records, _api, _resource, []), do: {:ok, records}
defp get_records(resource) do
published =
resource
|> AshBlog.DataLayer.Info.folder()
|> all_files(resource)
defp do_add_aggregates(records, api, _resource, aggregates) do
# TODO support crossing apis by getting the destination api, and set destination query context.
Enum.reduce_while(records, {:ok, []}, fn record, {:ok, records} ->
aggregates
|> Enum.reduce_while({:ok, record}, fn %{
kind: :count,
relationship_path: relationship_path,
query: query,
authorization_filter: authorization_filter,
name: name,
load: load
},
{:ok, record} ->
query =
if authorization_filter do
Ash.Query.do_filter(query, authorization_filter)
else
query
staged =
resource
|> AshBlog.DataLayer.Info.staging_folder()
|> all_files(resource)
archived =
resource
|> AshBlog.DataLayer.Info.archive_folder()
|> all_files(resource)
[published, staged, archived]
|> Stream.concat()
|> Enum.reduce_while({:ok, []}, fn file, {:ok, results} ->
contents = File.read!(file)
[data, body] =
contents
|> String.split("---", trim: true)
|> Enum.map(&String.trim/1)
case YamlElixir.read_all_from_string(data, one_result: true) do
{:ok, result} ->
attrs =
resource
|> Ash.Resource.Info.attributes()
|> Map.new(fn attr ->
{attr.name, Map.get(result, to_string(attr.name))}
end)
|> Map.put(AshBlog.DataLayer.Info.body_attribute(resource), body)
resource
|> struct(attrs)
|> cast_record(resource)
|> case do
{:ok, record} ->
{:cont, {:ok, [Ash.Resource.put_metadata(record, :ash_blog_file, file) | results]}}
{:error, error} ->
{:error, error}
end
with {:ok, loaded_record} <- api.load(record, relationship_path),
related <- Ash.Filter.Runtime.get_related(loaded_record, relationship_path),
{:ok, filtered} <-
filter_matches(related, query.filter, api) do
{:cont, {:ok, Map.put(record, load || name, Enum.count(filtered))}}
else
other ->
{:halt, other}
end
end)
|> case do
{:ok, record} ->
{:cont, {:ok, [record | records]}}
{:error, error} ->
{:halt, {:error, error}}
end
end)
|> case do
{:ok, records} ->
{:ok, Enum.reverse(records)}
{:error, error} ->
{:error, Ash.Error.to_ash_error(error)}
end
end
defp get_records(resource, tenant) do
with {:ok, table} <- wrap_or_create_table(resource, tenant),
{:ok, record_tuples} <- ETS.Set.to_list(table),
records <- Enum.map(record_tuples, &elem(&1, 1)) do
cast_records(records, resource)
end
end
@doc false
@ -373,16 +387,24 @@ defmodule AshBlog.DataLayer do
|> case do
{:ok, attrs} ->
{:ok,
%{
struct(resource, attrs)
| __meta__: %Ecto.Schema.Metadata{state: :loaded, schema: resource}
}}
Ash.Resource.set_meta(struct(resource, attrs), %Ecto.Schema.Metadata{
state: :loaded,
schema: resource
})}
{:error, error} ->
{:error, error}
end
end
defp expand_path(folder, resource) do
Path.join([priv_dir(resource), folder])
end
defp all_files(folder, resource) do
Path.wildcard(Path.join([expand_path(folder, resource), "**", "*.md"]))
end
defp filter_matches(records, nil, _api), do: {:ok, records}
defp filter_matches(records, filter, api) do
@ -391,66 +413,128 @@ defmodule AshBlog.DataLayer do
@doc false
@impl true
def upsert(resource, changeset, keys) do
keys = keys || Ash.Resource.Info.primary_key(resource)
if Enum.any?(keys, &is_nil(Ash.Changeset.get_attribute(changeset, &1))) do
create(resource, changeset)
else
key_filters =
Enum.map(keys, fn key ->
{key, Ash.Changeset.get_attribute(changeset, key)}
end)
query = Ash.Query.do_filter(resource, and: [key_filters])
def create(resource, changeset) do
file_name = file_name(resource, changeset)
file_path =
resource
|> resource_to_query(changeset.api)
|> Map.put(:filter, query.filter)
|> Map.put(:tenant, changeset.tenant)
|> run_query(resource)
|> case do
{:ok, []} ->
create(resource, changeset)
|> priv_dir()
|> Path.join(folder(resource, Ash.Changeset.get_attribute(changeset, :state)))
|> Path.join(file_name)
{:ok, [result]} ->
to_set = Ash.Changeset.set_on_upsert(changeset, keys)
with {:ok, record} <- Ash.Changeset.apply_attributes(changeset),
record <-
Ash.Resource.set_meta(record, %Ecto.Schema.Metadata{state: :loaded, schema: resource}),
{:ok, yaml} <- yaml_frontmatter(record) do
File.mkdir_p!(Path.dirname(file_path))
changeset =
changeset
|> Map.put(:attributes, %{})
|> Map.put(:data, result)
|> Ash.Changeset.force_change_attributes(to_set)
File.write!(
file_path,
"""
---
#{yaml}
---
#{Map.get(record, AshBlog.DataLayer.Info.body_attribute(resource))}
"""
)
update(resource, changeset)
{:ok, _} ->
{:error, "Multiple records matching keys"}
end
{:ok, Ash.Resource.put_metadata(record, :ash_blog_file, file_path)}
end
end
@doc false
@impl true
def create(resource, changeset) do
pkey =
resource
|> Ash.Resource.Info.primary_key()
|> Enum.into(%{}, fn attr ->
{attr, Ash.Changeset.get_attribute(changeset, attr)}
end)
defp folder(resource, :staged) do
AshBlog.DataLayer.Info.staging_folder(resource)
end
with {:ok, table} <- wrap_or_create_table(resource, changeset.tenant),
{:ok, record} <- Ash.Changeset.apply_attributes(changeset),
record <- unload_relationships(resource, record),
{:ok, record} <-
put_or_insert_new(table, {pkey, record}, resource) do
{:ok, %{record | __meta__: %Ecto.Schema.Metadata{state: :loaded, schema: resource}}}
else
{:error, error} -> {:error, Ash.Error.to_ash_error(error)}
defp folder(resource, :published) do
AshBlog.DataLayer.Info.folder(resource)
end
defp folder(resource, :archived) do
AshBlog.DataLayer.Info.archive_folder(resource)
end
defp yaml_frontmatter(%resource{} = record) do
body_attribute = AshBlog.DataLayer.Info.body_attribute(resource)
resource
|> Ash.Resource.Info.attributes()
|> Enum.reject(&(&1.name == body_attribute))
|> Enum.reduce_while({:ok, []}, fn attr, {:ok, acc} ->
if Ash.Type.storage_type(attr.type) in [
:string,
:integer,
:uuid,
:utc_datetime,
:utc_datetime_usec
] do
case Ash.Type.dump_to_embedded(attr.type, Map.get(record, attr.name), attr.constraints) do
{:ok, value} ->
{:cont, {:ok, [{attr.name, value} | acc]}}
{:error, error} ->
{:halt, {:error, error}}
end
else
{:halt, {:error, "#{inspect(attr.type)} is not yet supported by `AshBlog.DataLayer`"}}
end
end)
|> case do
{:ok, attrs} ->
{:ok,
attrs
|> Enum.reverse()
|> Enum.map_join("\n", fn {name, value} ->
case value do
value when is_binary(value) ->
"#{name}: '#{escape_string(value)}'"
%DateTime{} = value ->
"#{name}: '#{escape_string(value)}'"
other ->
"#{name}: #{other}"
end
end)}
{:error, error} ->
{:error, error}
end
end
defp escape_string(value) do
value
|> to_string()
|> String.replace("'", "\\'")
end
case Code.ensure_compiled(Mix) do
{:module, _} ->
def priv_dir(resource) do
_ = otp_app!(resource)
Path.join(File.cwd!(), "priv")
end
_ ->
def priv_dir(resource) do
:code.priv_dir(otp_app!(resource))
end
end
defp otp_app!(resource) do
Spark.otp_app(resource) ||
raise """
Must configure otp_app for #{inspect(resource)}. For example:
use Ash.Resource, otp_app: :my_app
"""
end
defp file_name(resource, changeset) do
{m, f, a} = AshBlog.DataLayer.Info.file_namer(resource)
apply(m, f, [changeset | a])
end
defp put_or_insert_new(table, {pkey, record}, resource) do
attributes = resource |> Ash.Resource.Info.attributes()
@ -498,46 +582,38 @@ defmodule AshBlog.DataLayer do
end)
end
@doc false
@impl true
def destroy(resource, %{data: record} = changeset) do
do_destroy(resource, record, changeset.tenant)
end
defp do_destroy(resource, record, tenant) do
pkey = Map.take(record, Ash.Resource.Info.primary_key(resource))
with {:ok, table} <- wrap_or_create_table(resource, tenant),
{:ok, _} <- ETS.Set.delete(table, pkey) do
:ok
else
{:error, error} -> {:error, Ash.Error.to_ash_error(error)}
end
end
@doc false
@impl true
def update(resource, changeset) do
pkey = pkey_map(resource, changeset.data)
with {:ok, table} <- wrap_or_create_table(resource, changeset.tenant),
{:ok, record} <- Ash.Changeset.apply_attributes(changeset),
with {:ok, record} <- Ash.Changeset.apply_attributes(changeset),
{:ok, record} <-
do_update(table, {pkey, record}, resource),
do_update(changeset, resource),
{:ok, record} <- cast_record(record, resource) do
new_pkey = pkey_map(resource, record)
file_path =
if folder(resource, record.state) == folder(resource, changeset.data.state) do
changeset.data.__metadata__[:ash_blog_file]
else
new_file_path =
Path.join(
folder(resource, record.state),
Path.basename(changeset.data.__metadata__[:ash_blog_file])
)
|> expand_path(resource)
if new_pkey != pkey do
case destroy(resource, changeset) do
:ok ->
{:ok, %{record | __meta__: %Ecto.Schema.Metadata{state: :loaded, schema: resource}}}
File.mkdir_p!(Path.dirname(new_file_path))
{:error, error} ->
{:error, Ash.Error.to_ash_error(error)}
File.rename!(
changeset.data.__metadata__[:ash_blog_file],
new_file_path
)
new_file_path
end
else
{:ok, %{record | __meta__: %Ecto.Schema.Metadata{state: :loaded, schema: resource}}}
end
{:ok,
record
|> Ash.Resource.put_metadata(:ash_blog_file, changeset.data.__metadata__[:ash_blog_file])
|> Ash.Resource.set_meta(%Ecto.Schema.Metadata{state: :loaded, schema: resource})}
else
{:error, error} ->
{:error, Ash.Error.to_ash_error(error)}
@ -553,66 +629,90 @@ defmodule AshBlog.DataLayer do
end)
end
defp do_update(table, {pkey, record}, resource) do
defp do_update(changeset, resource) do
attributes = resource |> Ash.Resource.Info.attributes()
case dump_to_native(record, attributes) do
{:ok, casted} ->
case ETS.Set.get(table, pkey) do
{:ok, {_key, record}} when is_map(record) ->
case ETS.Set.put(table, {pkey, Map.merge(record, casted)}) do
{:ok, set} ->
{_key, record} = ETS.Set.get!(set, pkey)
{:ok, record}
file_path =
changeset.data.__metadata__[:ash_blog_file] ||
raise "Missing `ash_blog_file` metadata for record, cannot update!"
error ->
error
end
with {:ok, record} <- Ash.Changeset.apply_attributes(changeset),
record <-
Ash.Resource.set_meta(record, %Ecto.Schema.Metadata{state: :loaded, schema: resource}),
{:ok, yaml} <- yaml_frontmatter(record) do
File.mkdir_p!(Path.dirname(file_path))
{:ok, _} ->
{:error, "Record not found matching: #{inspect(pkey)}"}
File.write!(
file_path,
"""
---
#{yaml}
---
#{Map.get(record, AshBlog.DataLayer.Info.body_attribute(resource))}
"""
)
other ->
other
end
{:error, error} ->
{:error, error}
{:ok, Ash.Resource.put_metadata(record, :ash_blog_file, file_path)}
end
end
@impl true
def transaction(resource, fun, timeout \\ :infinity) do
folder = folder(resource)
def transaction(resource, fun, _timeout) do
tx_identifiers = tx_identifiers(resource)
:global.trans(
{{:csv, folder}, System.unique_integer()},
fn ->
try do
Process.put({:blog_in_transaction, folder}, true)
{:res, fun.()}
catch
{{:csv_rollback, ^folder}, value} ->
all_in_transaction(tx_identifiers, fn ->
try do
fun.()
catch
{{:blog_rollback, rolled_back_tx_identifiers}, value} = thrown ->
if Enum.any?(tx_identifiers, &(&1 in rolled_back_tx_identifiers)) do
{:error, value}
end
else
throw(thrown)
end
end
end)
end
defp all_in_transaction([], fun) do
{:ok, fun.()}
end
defp all_in_transaction([tx_identifier | rest], fun) do
:global.trans(
{{:blog, tx_identifier}, System.unique_integer()},
fn ->
Process.put({:blog_in_transaction, tx_identifier}, true)
all_in_transaction(rest, fun)
end,
[node() | :erlang.nodes()],
timeout
0
)
|> case do
{:res, result} -> {:ok, result}
{:error, error} -> {:error, error}
:aborted -> {:error, "transaction failed"}
result -> result
end
end
@impl true
def rollback(resource, error) do
throw({{:blog_rollback, file(resource)}, error})
throw({{:blog_rollback, tx_identifiers(resource)}, error})
end
@impl true
def in_transaction?(resource) do
Process.get({:blog_in_transaction, file(resource)}, false) == true
resource
|> tx_identifiers()
|> Enum.any?(fn identifier ->
Process.get({:blog_in_transaction, identifier}, false) == true
end)
end
defp tx_identifiers(resource) do
[
AshBlog.DataLayer.Info.folder(resource),
AshBlog.DataLayer.Info.staging_folder(resource),
AshBlog.DataLayer.Info.archive_folder(resource)
]
end
end
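
Since `blog` is a regular Spark DSL section, the options documented in the schema above can be set directly on a resource. A minimal sketch, assuming the option names from that schema (the module name and folder paths here are illustrative, not verified defaults):

```elixir
defmodule MyApp.Post do
  use Ash.Resource,
    otp_app: :my_app,
    data_layer: AshBlog.DataLayer

  blog do
    # published posts live here, relative to the priv directory
    folder "static/blog"
    # unpublished posts are staged here until the :publish action runs
    staging_folder "blog/staging"
    # attribute that holds the markdown body of the post
    body_attribute :body
  end

  attributes do
    uuid_primary_key :id
  end
end
```

Each record is then persisted as a markdown file: a YAML frontmatter block built by `yaml_frontmatter/1` above, followed by the body attribute.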

View file

@ -0,0 +1,28 @@
defmodule AshBlog.FileNamer do
def name_file(changeset) do
name =
case Ash.Changeset.get_attribute(changeset, :title) ||
Ash.Changeset.get_attribute(changeset, :name) do
nil ->
nil
name ->
name
|> String.replace(~r/[^a-zA-Z0-9 _]/, "")
|> String.replace(~r/[^a-zA-Z0-9]/, "-")
|> String.trim("-")
end
if name do
Calendar.strftime(
DateTime.utc_now(),
Path.join(["%Y", "%Y-%m-%d-#{name}.md"])
)
else
Calendar.strftime(
DateTime.utc_now(),
Path.join(["%Y", "%Y-%m-%d.md"])
)
end
end
end
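
The `file_namer` option accepts any MFA with the same contract as the default namer above: it is called with the changeset prepended to the extra arguments and should return a relative path for the markdown file. A sketch of a custom namer, with hypothetical names throughout:

```elixir
defmodule MyApp.SlugNamer do
  # Referenced as {MyApp.SlugNamer, :name_file, ["posts"]} via the file_namer option.
  def name_file(changeset, prefix \\ "posts") do
    slug =
      changeset
      |> Ash.Changeset.get_attribute(:title)
      |> to_string()
      |> String.downcase()
      |> String.replace(~r/[^a-z0-9]+/, "-")
      |> String.trim("-")

    # e.g. "posts/2022-10-28-my-first-post.md"
    Path.join(prefix, "#{Date.utc_today()}-#{slug}.md")
  end
end
```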

60
lib/data_layer/info.ex Normal file
View file

@ -0,0 +1,60 @@
defmodule AshBlog.DataLayer.Info do
@moduledoc """
Introspection helpers for the AshBlog data layer.
"""
alias Spark.Dsl.Extension
def folder(resource) do
Extension.get_opt(resource, [:blog], :folder, "static/blog")
end
def staging_folder(resource) do
Extension.get_opt(resource, [:blog], :staging_folder, "blog/staging")
end
def archive_folder(resource) do
Extension.get_opt(resource, [:blog], :archive_folder, "blog/archive")
end
def file_namer(resource) do
Extension.get_opt(resource, [:blog], :file_namer, {AshBlog.FileNamer, :name_file, []})
end
def created_at_attribute(resource) do
Extension.get_opt(resource, [:blog], :created_at_attribute, :created_at)
end
def body_attribute(resource) do
Extension.get_opt(resource, [:blog], :body_attribute, :body)
end
def title_attribute(resource) do
Extension.get_opt(resource, [:blog], :title_attribute, :title)
end
def file_name(%resource{} = record) do
{mod, fun, args} = file_namer(resource)
case apply(mod, fun, [record | args]) do
{:ok, value} ->
{:ok, value}
{:error, error} ->
{:error, error}
value ->
raise """
Invalid value returned from file namer `#{inspect(mod)}.#{fun}/#{Enum.count(args) + 1}`.
Expected `{:ok, value}` or `{:error, error}`, got:
#{inspect(value)}
"""
end
end
def full_file_name(resource) do
Path.join([folder(resource), file_name(resource)])
end
end
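
For example, with a resource that uses the data layer and leaves the options at their defaults, the helpers above resolve like this (assuming a hypothetical `MyApp.Post`):

```elixir
AshBlog.DataLayer.Info.folder(MyApp.Post)
#=> "static/blog"

AshBlog.DataLayer.Info.staging_folder(MyApp.Post)
#=> "blog/staging"

AshBlog.DataLayer.Info.title_attribute(MyApp.Post)
#=> :title
```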

View file

@ -0,0 +1,41 @@
defmodule AshBlog.DataLayer.Transformers.AddStructure do
use Spark.Dsl.Transformer
alias AshBlog.DataLayer.Info
def transform(dsl_state) do
dsl_state
|> Ash.Resource.Builder.add_new_create_timestamp(Info.created_at_attribute(dsl_state))
|> Ash.Resource.Builder.add_new_attribute(Info.title_attribute(dsl_state), :string,
allow_nil?: false
)
|> Ash.Resource.Builder.add_new_attribute(Info.body_attribute(dsl_state), :string,
allow_nil?: false
)
|> Ash.Resource.Builder.add_new_attribute(:state, :atom,
constraints: [one_of: [:staged, :published, :archived]],
default: :staged
)
|> Ash.Resource.Builder.add_new_action(:update, :publish,
changes: [
Ash.Resource.Builder.build_action_change(
Ash.Resource.Change.Builtins.set_attribute(:state, :published)
)
]
)
|> Ash.Resource.Builder.add_new_action(:update, :stage,
changes: [
Ash.Resource.Builder.build_action_change(
Ash.Resource.Change.Builtins.set_attribute(:state, :staged)
)
]
)
|> Ash.Resource.Builder.add_new_action(:update, :archive,
changes: [
Ash.Resource.Builder.build_action_change(
Ash.Resource.Change.Builtins.set_attribute(:state, :archived)
)
]
)
end
end
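
For readers unfamiliar with Spark transformers: the net effect of the transformer above is roughly what you would get by writing the following into the resource by hand, assuming the default attribute names (the transformer only adds pieces that don't already exist):

```elixir
attributes do
  attribute :title, :string, allow_nil?: false
  attribute :body, :string, allow_nil?: false

  attribute :state, :atom,
    constraints: [one_of: [:staged, :published, :archived]],
    default: :staged

  create_timestamp :created_at
end

actions do
  update :publish do
    change set_attribute(:state, :published)
  end

  update :stage do
    change set_attribute(:state, :staged)
  end

  update :archive do
    change set_attribute(:state, :archived)
  end
end
```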

116
mix.exs
View file

@ -1,13 +1,101 @@
defmodule AshBlog.MixProject do
use Mix.Project
@description """
A blog data layer for Ash resources.
"""
@version "0.1.0"
def project do
[
app: :ash_blog,
version: "0.1.0",
version: @version,
elixir: "~> 1.14",
start_permanent: Mix.env() == :prod,
deps: deps()
package: package(),
aliases: aliases(),
deps: deps(),
elixirc_paths: elixirc_paths(Mix.env()),
dialyzer: [plt_add_apps: [:ash]],
docs: docs(),
description: @description,
source_url: "https://github.com/ash-project/ash_blog",
homepage_url: "https://github.com/ash-project/ash_blog"
]
end
defp elixirc_paths(:test) do
elixirc_paths(:dev) ++ ["test/support"]
end
defp elixirc_paths(_) do
["lib"]
end
defp extras() do
"documentation/**/*.md"
|> Path.wildcard()
|> Enum.map(fn path ->
title =
path
|> Path.basename(".md")
|> String.split(~r/[-_]/)
|> Enum.map(&String.capitalize/1)
|> Enum.join(" ")
|> case do
"F A Q" ->
"FAQ"
other ->
other
end
{String.to_atom(path),
[
title: title
]}
end)
end
defp groups_for_extras() do
"documentation/*"
|> Path.wildcard()
|> Enum.map(fn folder ->
name =
folder
|> Path.basename()
|> String.split(~r/[-_]/)
|> Enum.map(&String.capitalize/1)
|> Enum.join(" ")
{name, folder |> Path.join("**") |> Path.wildcard()}
end)
end
defp docs do
[
main: "AshBlog",
source_ref: "v#{@version}",
extra_section: "GUIDES",
extras: extras(),
groups_for_extras: groups_for_extras(),
groups_for_modules: [
"Resource DSL": ~r/AshGraphql.Resource/,
"Api DSL": ~r/AshGraphql.Api/
]
]
end
defp package do
[
name: :ash_blog,
licenses: ["MIT"],
files: ~w(lib .formatter.exs mix.exs README* LICENSE*
CHANGELOG* documentation),
links: %{
GitHub: "https://github.com/ash-project/ash_blog"
}
]
end
@ -21,10 +109,30 @@ defmodule AshBlog.MixProject do
# Run "mix help deps" to learn about dependencies.
defp deps do
[
{:spark, "~> 0.1.29"},
{:ash, "~> 2.0"}
{:ash, github: "ash-project/ash"},
# {:ash, path: "../ash"},
{:yaml_elixir, "~> 2.9"},
# dev/test dependencies
{:elixir_sense,
github: "elixir-lsp/elixir_sense", ref: "85d4a87d", only: [:dev, :test, :docs]},
{:ex_doc, "~> 0.22", only: :dev, runtime: false},
{:ex_check, "~> 0.12.0", only: :dev},
{:credo, ">= 0.0.0", only: :dev, runtime: false},
{:dialyxir, ">= 0.0.0", only: :dev, runtime: false},
{:sobelow, ">= 0.0.0", only: :dev, runtime: false},
{:git_ops, "~> 2.0.1", only: :dev},
{:excoveralls, "~> 0.13.0", only: [:dev, :test]}
# {:dep_from_hexpm, "~> 0.3.0"},
# {:dep_from_git, git: "https://github.com/elixir-lang/my_dep.git", tag: "0.1.0"}
]
end
defp aliases do
[
sobelow: "sobelow --skip",
credo: "credo --strict",
"spark.formatter": "spark.formatter --extensions AshCsv.DataLayer"
]
end
end
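
For an application consuming the package once it is published, the dependency entry would be the usual Hex declaration (a sketch; the version requirement is illustrative for this initial 0.1.0 release):

```elixir
# In the consuming application's mix.exs
defp deps do
  [
    {:ash, "~> 2.2"},
    {:ash_blog, "~> 0.1.0"}
  ]
end
```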

View file

@ -1,16 +1,46 @@
%{
"ash": {:hex, :ash, "2.2.0", "4fdc0fef5afb3f5045b1ca4e1ccb139b9f703cbc7c21dc645e32ac9582b11f63", [:mix], [{:comparable, "~> 1.0", [hex: :comparable, repo: "hexpm", optional: false]}, {:decimal, "~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:earmark, "~> 1.4", [hex: :earmark, repo: "hexpm", optional: true]}, {:ecto, "~> 3.7", [hex: :ecto, repo: "hexpm", optional: false]}, {:ets, "~> 0.8.0", [hex: :ets, repo: "hexpm", optional: false]}, {:jason, ">= 1.0.0", [hex: :jason, repo: "hexpm", optional: false]}, {:nimble_options, "~> 0.4.0", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:picosat_elixir, "~> 0.2", [hex: :picosat_elixir, repo: "hexpm", optional: false]}, {:spark, "~> 0.1 and >= 0.1.28", [hex: :spark, repo: "hexpm", optional: false]}, {:stream_data, "~> 0.5.0", [hex: :stream_data, repo: "hexpm", optional: false]}, {:telemetry, "~> 1.1", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "48eca587e7076fe4f8547e919c0712f081ce85e66c316f6f51dd2535ad046013"},
"ash": {:git, "https://github.com/ash-project/ash.git", "e52d7187d889d3ec3403e7d8bb411f12bed3b103", []},
"bunt": {:hex, :bunt, "0.2.1", "e2d4792f7bc0ced7583ab54922808919518d0e57ee162901a16a1b6664ef3b14", [:mix], [], "hexpm", "a330bfb4245239787b15005e66ae6845c9cd524a288f0d141c148b02603777a5"},
"certifi": {:hex, :certifi, "2.9.0", "6f2a475689dd47f19fb74334859d460a2dc4e3252a3324bd2111b8f0429e7e21", [:rebar3], [], "hexpm", "266da46bdb06d6c6d35fde799bcb28d36d985d424ad7c08b5bb48f5b5cdd4641"},
"comparable": {:hex, :comparable, "1.0.0", "bb669e91cedd14ae9937053e5bcbc3c52bb2f22422611f43b6e38367d94a495f", [:mix], [{:typable, "~> 0.1", [hex: :typable, repo: "hexpm", optional: false]}], "hexpm", "277c11eeb1cd726e7cd41c6c199e7e52fa16ee6830b45ad4cdc62e51f62eb60c"},
"credo": {:hex, :credo, "1.6.7", "323f5734350fd23a456f2688b9430e7d517afb313fbd38671b8a4449798a7854", [:mix], [{:bunt, "~> 0.2.1", [hex: :bunt, repo: "hexpm", optional: false]}, {:file_system, "~> 0.2.8", [hex: :file_system, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "41e110bfb007f7eda7f897c10bf019ceab9a0b269ce79f015d54b0dcf4fc7dd3"},
"decimal": {:hex, :decimal, "2.0.0", "a78296e617b0f5dd4c6caf57c714431347912ffb1d0842e998e9792b5642d697", [:mix], [], "hexpm", "34666e9c55dea81013e77d9d87370fe6cb6291d1ef32f46a1600230b1d44f577"},
"dialyxir": {:hex, :dialyxir, "1.2.0", "58344b3e87c2e7095304c81a9ae65cb68b613e28340690dfe1a5597fd08dec37", [:mix], [{:erlex, ">= 0.2.6", [hex: :erlex, repo: "hexpm", optional: false]}], "hexpm", "61072136427a851674cab81762be4dbeae7679f85b1272b6d25c3a839aff8463"},
"docsh": {:hex, :docsh, "0.7.2", "f893d5317a0e14269dd7fe79cf95fb6b9ba23513da0480ec6e77c73221cae4f2", [:rebar3], [{:providers, "1.8.1", [hex: :providers, repo: "hexpm", optional: false]}], "hexpm", "4e7db461bb07540d2bc3d366b8513f0197712d0495bb85744f367d3815076134"},
"earmark_parser": {:hex, :earmark_parser, "1.4.29", "149d50dcb3a93d9f3d6f3ecf18c918fb5a2d3c001b5d3305c926cddfbd33355b", [:mix], [], "hexpm", "4902af1b3eb139016aed210888748db8070b8125c2342ce3dcae4f38dcc63503"},
"ecto": {:hex, :ecto, "3.9.1", "67173b1687afeb68ce805ee7420b4261649d5e2deed8fe5550df23bab0bc4396", [:mix], [{:decimal, "~> 1.6 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "c80bb3d736648df790f7f92f81b36c922d9dd3203ca65be4ff01d067f54eb304"},
"elixir_make": {:hex, :elixir_make, "0.6.3", "bc07d53221216838d79e03a8019d0839786703129599e9619f4ab74c8c096eac", [:mix], [], "hexpm", "f5cbd651c5678bcaabdbb7857658ee106b12509cd976c2c2fca99688e1daf716"},
"elixir_sense": {:git, "https://github.com/elixir-lsp/elixir_sense.git", "85d4a87d216678dae30f348270eb90f9ed49ce20", [ref: "85d4a87d"]},
"erlex": {:hex, :erlex, "0.2.6", "c7987d15e899c7a2f34f5420d2a2ea0d659682c06ac607572df55a43753aa12e", [:mix], [], "hexpm", "2ed2e25711feb44d52b17d2780eabf998452f6efda104877a3881c2f8c0c0c75"},
"ets": {:hex, :ets, "0.8.1", "8ff9bcda5682b98493f8878fc9dbd990e48d566cba8cce59f7c2a78130da29ea", [:mix], [], "hexpm", "6be41b50adb5bc5c43626f25ea2d0af1f4a242fb3fad8d53f0c67c20b78915cc"},
"ex_check": {:hex, :ex_check, "0.12.0", "c0e2919ecc06afeaf62c52d64f3d91bd4bc7dd8deaac5f84becb6278888c967a", [:mix], [], "hexpm", "cfafa8ef97c2596d45a1f19b5794cb5c7f700f25d164d3c9f8d7ec17ee67cf42"},
"ex_doc": {:hex, :ex_doc, "0.29.0", "4a1cb903ce746aceef9c1f9ae8a6c12b742a5461e6959b9d3b24d813ffbea146", [:mix], [{:earmark_parser, "~> 1.4.19", [hex: :earmark_parser, repo: "hexpm", optional: false]}, {:makeup_elixir, "~> 0.14", [hex: :makeup_elixir, repo: "hexpm", optional: false]}, {:makeup_erlang, "~> 0.1", [hex: :makeup_erlang, repo: "hexpm", optional: false]}], "hexpm", "f096adb8bbca677d35d278223361c7792d496b3fc0d0224c9d4bc2f651af5db1"},
"excoveralls": {:hex, :excoveralls, "0.13.4", "7b0baee01fe150ef81153e6ffc0fc68214737f54570dc257b3ca4da8e419b812", [:mix], [{:hackney, "~> 1.16", [hex: :hackney, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "faae00b3eee35cdf0342c10b669a7c91f942728217d2a7c7f644b24d391e6190"},
"file_system": {:hex, :file_system, "0.2.10", "fb082005a9cd1711c05b5248710f8826b02d7d1784e7c3451f9c1231d4fc162d", [:mix], [], "hexpm", "41195edbfb562a593726eda3b3e8b103a309b733ad25f3d642ba49696bf715dc"},
"getopt": {:hex, :getopt, "1.0.1", "c73a9fa687b217f2ff79f68a3b637711bb1936e712b521d8ce466b29cbf7808a", [:rebar3], [], "hexpm", "53e1ab83b9ceb65c9672d3e7a35b8092e9bdc9b3ee80721471a161c10c59959c"},
"git_cli": {:hex, :git_cli, "0.3.0", "a5422f9b95c99483385b976f5d43f7e8233283a47cda13533d7c16131cb14df5", [:mix], [], "hexpm", "78cb952f4c86a41f4d3511f1d3ecb28edb268e3a7df278de2faa1bd4672eaf9b"},
"git_ops": {:hex, :git_ops, "2.0.2", "93ca4b227ea4aa4e927735940c3c4b6411c156dc556bc68fe1ca7fe585010f99", [:mix], [{:git_cli, "~> 0.2", [hex: :git_cli, repo: "hexpm", optional: false]}, {:nimble_parsec, "~> 1.0", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "d958a9bfc768bce03648e32cf157ecbd1225fc57690dfbef02001ed06b1793ad"},
"hackney": {:hex, :hackney, "1.18.1", "f48bf88f521f2a229fc7bae88cf4f85adc9cd9bcf23b5dc8eb6a1788c662c4f6", [:rebar3], [{:certifi, "~>2.9.0", [hex: :certifi, repo: "hexpm", optional: false]}, {:idna, "~>6.1.0", [hex: :idna, repo: "hexpm", optional: false]}, {:metrics, "~>1.0.0", [hex: :metrics, repo: "hexpm", optional: false]}, {:mimerl, "~>1.1", [hex: :mimerl, repo: "hexpm", optional: false]}, {:parse_trans, "3.3.1", [hex: :parse_trans, repo: "hexpm", optional: false]}, {:ssl_verify_fun, "~>1.1.0", [hex: :ssl_verify_fun, repo: "hexpm", optional: false]}, {:unicode_util_compat, "~>0.7.0", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "a4ecdaff44297e9b5894ae499e9a070ea1888c84afdd1fd9b7b2bc384950128e"},
"idna": {:hex, :idna, "6.1.1", "8a63070e9f7d0c62eb9d9fcb360a7de382448200fbbd1b106cc96d3d8099df8d", [:rebar3], [{:unicode_util_compat, "~>0.7.0", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "92376eb7894412ed19ac475e4a86f7b413c1b9fbb5bd16dccd57934157944cea"},
"jason": {:hex, :jason, "1.4.0", "e855647bc964a44e2f67df589ccf49105ae039d4179db7f6271dfd3843dc27e6", [:mix], [{:decimal, "~> 1.0 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: true]}], "hexpm", "79a3791085b2a0f743ca04cec0f7be26443738779d09302e01318f97bdb82121"},
"makeup": {:hex, :makeup, "1.1.0", "6b67c8bc2882a6b6a445859952a602afc1a41c2e08379ca057c0f525366fc3ca", [:mix], [{:nimble_parsec, "~> 1.2.2 or ~> 1.3", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "0a45ed501f4a8897f580eabf99a2e5234ea3e75a4373c8a52824f6e873be57a6"},
"makeup_elixir": {:hex, :makeup_elixir, "0.16.0", "f8c570a0d33f8039513fbccaf7108c5d750f47d8defd44088371191b76492b0b", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}, {:nimble_parsec, "~> 1.2.3", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "28b2cbdc13960a46ae9a8858c4bebdec3c9a6d7b4b9e7f4ed1502f8159f338e7"},
"makeup_erlang": {:hex, :makeup_erlang, "0.1.1", "3fcb7f09eb9d98dc4d208f49cc955a34218fc41ff6b84df7c75b3e6e533cc65f", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}], "hexpm", "174d0809e98a4ef0b3309256cbf97101c6ec01c4ab0b23e926a9e17df2077cbb"},
"metrics": {:hex, :metrics, "1.0.1", "25f094dea2cda98213cecc3aeff09e940299d950904393b2a29d191c346a8486", [:rebar3], [], "hexpm", "69b09adddc4f74a40716ae54d140f93beb0fb8978d8636eaded0c31b6f099f16"},
"mimerl": {:hex, :mimerl, "1.2.0", "67e2d3f571088d5cfd3e550c383094b47159f3eee8ffa08e64106cdf5e981be3", [:rebar3], [], "hexpm", "f278585650aa581986264638ebf698f8bb19df297f66ad91b18910dfc6e19323"},
"nimble_options": {:hex, :nimble_options, "0.4.0", "c89babbab52221a24b8d1ff9e7d838be70f0d871be823165c94dd3418eea728f", [:mix], [], "hexpm", "e6701c1af326a11eea9634a3b1c62b475339ace9456c1a23ec3bc9a847bca02d"},
"nimble_parsec": {:hex, :nimble_parsec, "1.2.3", "244836e6e3f1200c7f30cb56733fd808744eca61fd182f731eac4af635cc6d0b", [:mix], [], "hexpm", "c8d789e39b9131acf7b99291e93dae60ab48ef14a7ee9d58c6964f59efb570b0"},
"parse_trans": {:hex, :parse_trans, "3.3.1", "16328ab840cc09919bd10dab29e431da3af9e9e7e7e6f0089dd5a2d2820011d8", [:rebar3], [], "hexpm", "07cd9577885f56362d414e8c4c4e6bdf10d43a8767abb92d24cbe8b24c54888b"},
"picosat_elixir": {:hex, :picosat_elixir, "0.2.2", "1cacfdb4fb0c3ead5e5e9b1e98ac822a777f07eab35e29c3f8fc7086de2bfb36", [:make, :mix], [{:elixir_make, "~> 0.6", [hex: :elixir_make, repo: "hexpm", optional: false]}], "hexpm", "9d0cc569552cca417abea8270a54b71153a63be4b951ff249e94642f1c0f35d1"},
"providers": {:hex, :providers, "1.8.1", "70b4197869514344a8a60e2b2a4ef41ca03def43cfb1712ecf076a0f3c62f083", [:rebar3], [{:getopt, "1.0.1", [hex: :getopt, repo: "hexpm", optional: false]}], "hexpm", "e45745ade9c476a9a469ea0840e418ab19360dc44f01a233304e118a44486ba0"},
"sobelow": {:hex, :sobelow, "0.11.1", "23438964486f8112b41e743bbfd402da3e5b296fdc9eacab29914b79c48916dd", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "9897363a7eff96f4809304a90aad819e2ad5e5d24db547af502885146746a53c"},
"sourceror": {:hex, :sourceror, "0.11.2", "549ce48be666421ac60cfb7f59c8752e0d393baa0b14d06271d3f6a8c1b027ab", [:mix], [], "hexpm", "9ab659118896a36be6eec68ff7b0674cba372fc8e210b1e9dc8cf2b55bb70dfb"},
"spark": {:hex, :spark, "0.1.29", "36f29894fdf8b30aa866a677134654db72807cf02a998aee948a0c5e98a48018", [:mix], [{:nimble_options, "~> 0.4.0", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:sourceror, "~> 0.1", [hex: :sourceror, repo: "hexpm", optional: false]}], "hexpm", "97ed044974cd47e9286d9fa0fd033620bee6b3569bee27e79d1b9bdb4605371e"},
"spark": {:hex, :spark, "0.2.2", "782989111ef63c76ab02779c1f996f0139b644a688a9f08445a33623f4737ff1", [:mix], [{:nimble_options, "~> 0.4.0", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:sourceror, "~> 0.1", [hex: :sourceror, repo: "hexpm", optional: false]}], "hexpm", "998684022e932b18c1512d2d1ac34938eb719df617e982281a50e4a4ea3fdf93"},
"ssl_verify_fun": {:hex, :ssl_verify_fun, "1.1.6", "cf344f5692c82d2cd7554f5ec8fd961548d4fd09e7d22f5b62482e5aeaebd4b0", [:make, :mix, :rebar3], [], "hexpm", "bdb0d2471f453c88ff3908e7686f86f9be327d065cc1ec16fa4540197ea04680"},
"stream_data": {:hex, :stream_data, "0.5.0", "b27641e58941685c75b353577dc602c9d2c12292dd84babf506c2033cd97893e", [:mix], [], "hexpm", "012bd2eec069ada4db3411f9115ccafa38540a3c78c4c0349f151fc761b9e271"},
"telemetry": {:hex, :telemetry, "1.1.0", "a589817034a27eab11144ad24d5c0f9fab1f58173274b1e9bae7074af9cbee51", [:rebar3], [], "hexpm", "b727b2a1f75614774cff2d7565b64d0dfa5bd52ba517f16543e6fc7efcc0df48"},
"typable": {:hex, :typable, "0.3.0", "0431e121d124cd26f312123e313d2689b9a5322b15add65d424c07779eaa3ca1", [:mix], [], "hexpm", "880a0797752da1a4c508ac48f94711e04c86156f498065a83d160eef945858f8"},
"unicode_util_compat": {:hex, :unicode_util_compat, "0.7.0", "bc84380c9ab48177092f43ac89e4dfa2c6d62b40b8bd132b1059ecc7232f9a78", [:rebar3], [], "hexpm", "25eee6d67df61960cf6a794239566599b09e17e668d3700247bc498638152521"},
"yamerl": {:hex, :yamerl, "0.10.0", "4ff81fee2f1f6a46f1700c0d880b24d193ddb74bd14ef42cb0bcf46e81ef2f8e", [:rebar3], [], "hexpm", "346adb2963f1051dc837a2364e4acf6eb7d80097c0f53cbdc3046ec8ec4b4e6e"},
"yaml_elixir": {:hex, :yaml_elixir, "2.9.0", "9a256da867b37b8d2c1ffd5d9de373a4fda77a32a45b452f1708508ba7bbcb53", [:mix], [{:yamerl, "~> 0.10", [hex: :yamerl, repo: "hexpm", optional: false]}], "hexpm", "0cb0e7d4c56f5e99a6253ed1a670ed0e39c13fc45a6da054033928607ac08dfc"},
}

View file

@ -1,8 +1,45 @@
defmodule AshBlogTest do
use ExUnit.Case
doctest AshBlog
test "greets the world" do
assert AshBlog.hello() == :world
alias AshBlog.Test.Post
setup do
on_exit(fn ->
File.rm_rf!("priv/blog")
File.rm_rf!("priv/static/blog")
end)
:ok
end
describe "creating a blog post" do
test "a blog post can be created" do
assert %{title: "first\"", body: "the body"} = Post.create!("first\"", "the body")
end
end
describe "reading blog posts" do
test "blog posts can be listed" do
Post.create!("first\"", "the body")
assert [%{title: "first\"", body: "the body"}] = Post.read!()
end
end
describe "updating blog posts" do
test "blog posts can be published" do
post = Post.create!("first\"", "the body")
assert %{state: :published} = Post.publish!(post)
assert [%{state: :published, title: "first\"", body: "the body"}] = Post.read!()
assert [_] = Path.wildcard("priv/static/blog/**/*.md")
end
test "blog posts can be archived" do
post = Post.create!("first\"", "the body")
assert %{state: :published} = Post.publish!(post)
assert [%{state: :published, title: "first\"", body: "the body"} = post] = Post.read!()
assert [_] = Path.wildcard("priv/static/blog/**/*.md")
assert %{state: :archived} = Post.archive!(post)
assert [_] = Path.wildcard("priv/blog/archive/**/*.md")
end
end
end

7
test/support/api.ex Normal file
View file

@ -0,0 +1,7 @@
defmodule AshBlog.Test.Api do
use Ash.Api
resources do
allow_unregistered? true
end
end

22
test/support/blog/post.ex Normal file
View file

@ -0,0 +1,22 @@
defmodule AshBlog.Test.Post do
use Ash.Resource,
otp_app: :ash_blog,
data_layer: AshBlog.DataLayer
actions do
defaults [:create, :read]
end
attributes do
uuid_primary_key :id
end
code_interface do
define_for AshBlog.Test.Api
define :create, args: [:title, :body]
define :read, action: :read
define :stage, action: :stage
define :publish, action: :publish
define :archive, action: :archive
end
end