<pre style='margin:0'>
Renee Otten (reneeotten) pushed a commit to branch master
in repository macports-ports.
</pre>
<p><a href="https://github.com/macports/macports-ports/commit/0635d201b364dce576fa2789cc446b4b2233b0ac">https://github.com/macports/macports-ports/commit/0635d201b364dce576fa2789cc446b4b2233b0ac</a></p>
<pre style="white-space: pre; background: #F8F8F8">The following commit(s) were added to refs/heads/master by this push:
<span style='display:block; white-space:pre;color:#404040;'> new 0635d20 py-transformers: new submission, version 2.1.1
</span>0635d20 is described below
<span style='display:block; white-space:pre;color:#808000;'>commit 0635d201b364dce576fa2789cc446b4b2233b0ac
</span>Author: Steve Smith <essandess@users.noreply.github.com>
AuthorDate: Fri Nov 22 22:45:49 2019 -0500
<span style='display:block; white-space:pre;color:#404040;'> py-transformers: new submission, version 2.1.1
</span>---
python/py-transformers/Portfile | 57 +++++++++++++++++++++++++++++++++++++++++
1 file changed, 57 insertions(+)
<span style='display:block; white-space:pre;color:#808080;'>diff --git a/python/py-transformers/Portfile b/python/py-transformers/Portfile
</span>new file mode 100644
<span style='display:block; white-space:pre;color:#808080;'>index 0000000..65a7d7b
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>--- /dev/null
</span><span style='display:block; white-space:pre;background:#e0e0ff;'>+++ b/python/py-transformers/Portfile
</span><span style='display:block; white-space:pre;background:#e0e0e0;'>@@ -0,0 +1,57 @@
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+# -*- coding: utf-8; mode: tcl; tab-width: 4; indent-tabs-mode: nil; c-basic-offset: 4 -*- vim:fenc=utf-8:ft=tcl:et:sw=4:ts=4:sts=4
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+PortSystem 1.0
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+PortGroup github 1.0
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+PortGroup python 1.0
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+github.setup huggingface transformers 2.1.1 v
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+revision 0
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+name py-${github.project}
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+categories-append textproc
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+license Apache-2
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+maintainers nomaintainer
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+platforms darwin
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+supported_archs noarch
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+description State-of-the-art Natural Language Processing for\
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ TensorFlow 2.0 and PyTorch
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+long_description \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ 🤗 Transformers (formerly known as pytorch-transformers and\
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ pytorch-pretrained-bert) provides state-of-the-art general-purpose\
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ architectures (BERT, GPT-2, RoBERTa, XLM, DistilBert, XLNet,\
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ CTRL...) for Natural Language Understanding (NLU) and Natural\
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ Language Generation (NLG) with over 32+ pretrained models in 100+\
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ languages and deep interoperability between TensorFlow 2.0 and\
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ PyTorch.
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+homepage https://huggingface.co/transformers/
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+python.versions 37 38
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+checksums rmd160 708151f94aa3a805e2f9c5b845d4d4ae096a5bc5 \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ sha256 48c919a658e6429045b678fddb40eb6401c6be66b4d3a12a7c738da5663d952b \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ size 808428
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+if {${name} ne ${subport}} {
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ # see https://github.com/huggingface/transformers/blob/master/setup.py
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ depends_lib-append \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ port:py${python.version}-setuptools
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ depends_run-append \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ port:py${python.version}-boto3 \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ port:py${python.version}-numpy \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ port:py${python.version}-protobuf3 \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ port:py${python.version}-regex \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ port:py${python.version}-requests \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ port:py${python.version}-sacremoses \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ port:py${python.version}-sentencepiece \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ port:py${python.version}-tqdm
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ depends_test-append \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ port:py${python.version}-pytest
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ test.run yes
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ test.cmd py.test-${python.branch}
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ test.pre_args -sv
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ test.target ./transformers/tests/
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+}
</span></pre>