essay and homework ghostwriting gigs, turned into an AI-automated project:
collect project descriptions from the major freelance sites, QQ order-taking groups, and Xianyu; manually label each project's quoted price; train a bot that quotes prices and takes a commission automatically
using the project descriptions plus large amounts of public data, do open-ended question answering; train and deploy a paid Q&A bot for paper-writing and implementation work
analyze the source code first, then plan the attack or fix the code
search for "darpa cgc" on github
cyber-challenge: some toy examples demonstrating ideas that could be used in DARPA's Cyber Grand Challenge, including modifying java bytecode and filtering out html requests on the fly
EVIL (Exploiting software VIa natural Language) is an approach to automatically generate software exploits in assembly/Python language from descriptions in natural language. The approach leverages Neural Machine Translation (NMT) techniques and a dataset that we developed for this work.
topics: linux, exploit, encoder, assembly, decoder, dataset, seq2seq, shellcode, nmt, software-exploitation, codebert. license: GPL-3.0. contributors: @piliguori (Pietro Liguori), @taisazero (Erfan Al-Hossami). languages: Python 97.6%, Shell 2.0%, Other 0.4%.
warning: installing mindsdb directly could break your dependencies. better to use docker instead:

```shell
docker pull mindsdb/mindsdb
```
most useful feature:
training hidden Markov models and inferring their hidden states
All models that support labeled data support semi-supervised learning, including naive Bayes classifiers, general Bayes classifiers, and hidden Markov models.
While probability distributions are frequently used as components of more complex models such as mixtures and hidden Markov models, they can also be used by themselves. Many data science tasks require fitting a distribution to data or generating samples under a distribution. pomegranate has a large library of both univariate and multivariate distributions which can be used with an intuitive interface.
General Mixture Models (GMMs) are an unsupervised probabilistic model composed of multiple distributions (commonly referred to as components) and corresponding weights. This allows you to model more complex distributions corresponding to a single underlying phenomenon. For a full tutorial on what a mixture model is and how to use them, see the above tutorial.
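as a rough illustration of what fitting a two-component mixture does under the hood (plain numpy, not pomegranate's actual API), here is a minimal EM loop for a 1-D Gaussian mixture with fixed unit variance:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic data: two 1-D Gaussian components centered at -2 and 3
data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 300)])

# initial guesses: component means and equal weights
mu = np.array([-1.0, 1.0])
w = np.array([0.5, 0.5])

for _ in range(50):
    # E-step: responsibility of each component for each point
    dens = np.exp(-0.5 * (data[:, None] - mu) ** 2) * w
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights and means from the responsibilities
    w = resp.mean(axis=0)
    mu = (resp * data[:, None]).sum(axis=0) / resp.sum(axis=0)

print(mu)  # means should land near -2 and 3
```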
provide tasks and give the best model for given task
i'm afraid categorizing and classifying everything on paperswithcode.com takes a huge amount of time. better to pull out the basic task categories before you actually need to search for something.
aistudio.baidu.com
paperswithcode.com
kaggle.com
autograd and xla (Accelerated Linear Algebra)
With its updated version of Autograd, JAX can automatically differentiate native Python and NumPy functions. It can differentiate through loops, branches, recursion, and closures, and it can take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation) via grad as well as forward-mode differentiation, and the two can be composed arbitrarily to any order.
XLA (Accelerated Linear Algebra) is a domain-specific compiler for linear algebra that can accelerate TensorFlow models with potentially no source code changes.
probabilistic programming
a pyro implementation in numpy, still in alpha stage
machine learning in python
install the official python bindings:

```shell
pip install -U libsvm-official
```
the third-party python libsvm package is installed with:

```shell
pip install libsvm
```
opennlp uses onnx runtime(maybe?), may support m1 inference.
opennlp is written in java. after installing openjdk on macos with homebrew, run this to ensure openjdk is detected:

```shell
sudo ln -sfn $(brew --prefix)/opt/openjdk/libexec/openjdk.jdk /Library/Java/JavaVirtualMachines/openjdk.jdk
```
opennlp has a language detector covering 103 languages, including chinese. it also has a sentence detector (splitter) which could perhaps be trained on chinese.
to use opennlp with less code written, here's how to invoke its java API from kotlin:
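a minimal sketch (assumes `opennlp-tools` is on the classpath); kotlin can call opennlp's java classes directly, with no bridge code:

```kotlin
import opennlp.tools.tokenize.SimpleTokenizer

fun main() {
    // SimpleTokenizer is rule-based, so no model file is required
    val tokenizer = SimpleTokenizer.INSTANCE
    val tokens = tokenizer.tokenize("OpenNLP called straight from Kotlin.")
    println(tokens.joinToString(" | "))
}
```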
found in a manning article about better search-engine suggestions. in that example it is used with lucene, which also has image-retrieval (LIRE) capability. lucene is also available as lucene.net in dotnet/c#.
to install lucene.net:

```shell
dotnet add package Lucene.Net --prerelease
```
deep learning library for java
gradient boosting trains an ensemble of decision trees, usable for both classification and regression.
Light Gradient Boosting Machine
has an official command-line tool. installation on macos:

```shell
brew install lightgbm
```
to install the python package on macos, first install cmake (then install lightgbm itself via pip):

```shell
brew install cmake
```
to enable jax-based sampling, install numpyro or blackjax via pip
differences between pymc3 (old) and pymc (pymc4):
pymc is optimized and faster than pymc3
pymc3 uses theano as its backend, while pymc uses aesara (a fork of theano)
docs with live demo of pymc
PyMC is a probabilistic programming library for Python that allows users to build Bayesian models with a simple Python API and fit them using Markov chain Monte Carlo (MCMC) methods.
a high level torch wrapper including “out of the box” support for vision, text, tabular, and collab (collaborative filtering) models.
fastai was spotted on the opennlp-related twitter list shown on opennlp's official website.
fastai supposedly does not support macos, though that may be outdated: fastai sits on top of pytorch, and initial support appears to start with fastai 2.7.8; the current release is 2.7.9.
searching 'samoyed' like this on github turns up imagewoof, a pet-classification dataset from fastai's 2020 tutorial series. more image classes, such as subcategories of cats, may be found in imagenet.