
Crazytalk pipeline demos

Beidi, Tri, Atri, and team describe Pixelated Butterfly for hardware-efficient sparsity. Sabri and team discuss discovering systematic errors with cross-modal embeddings.


  • Albert and team released code for our work on state space models (a toy sketch of the underlying recurrence appears after this list).
  • We are very grateful that Sasha Rush and Sidd Karamcheti created an Annotated S4, along with a port to JAX!
  • Oral, Honorable Mention Outstanding Paper
  • Albert and Karan write about modeling long sequences with S4.
  • In ICLR 2022, we share some results on state space models, Domino, and sparsity.
  • Afaik, a first! Outstanding Poster at Efficient ML, plus a Long Talk and Outstanding Paper Runner-up at ICML.
  • Tri, Beidi, and friends show wall-clock improvements over highly optimized versions of BERT and GPT using sparsity.
  • We describe the Correct-n-Contrast approach, which sets the state of the art on robustness problems.
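
To make the state space items above concrete, here is a minimal sketch of the discretized linear state space model that S4 builds on: a continuous system x'(t) = Ax(t) + Bu(t), y(t) = Cx(t) is discretized with the bilinear (Tustin) rule and run as the recurrence x_k = Āx_{k-1} + B̄u_k, y_k = Cx_k. This is not the lab's released code; the dimensions and random parameters are arbitrary toy choices, and S4's structured HiPPO-based parametrization and convolutional mode are omitted.

```python
import numpy as np

def discretize(A, B, dt):
    """Bilinear (Tustin) discretization of the continuous-time SSM x' = Ax + Bu."""
    n = A.shape[0]
    I = np.eye(n)
    left = np.linalg.inv(I - (dt / 2.0) * A)
    A_bar = left @ (I + (dt / 2.0) * A)
    B_bar = left @ (dt * B)
    return A_bar, B_bar

def ssm_scan(A_bar, B_bar, C, u):
    """Run the linear recurrence x_k = A_bar x_{k-1} + B_bar u_k, y_k = C x_k."""
    x = np.zeros(A_bar.shape[0])
    ys = []
    for u_k in u:                       # sequential (recurrent) mode
        x = A_bar @ x + B_bar[:, 0] * u_k
        ys.append(C @ x)
    return np.array(ys)

# Toy example: 4-dimensional state, scalar input/output, length-1000 sequence.
rng = np.random.default_rng(0)
A = -np.eye(4) + 0.1 * rng.standard_normal((4, 4))   # arbitrary, roughly stable dynamics
B = rng.standard_normal((4, 1))
C = rng.standard_normal(4)
A_bar, B_bar = discretize(A, B, dt=0.01)
y = ssm_scan(A_bar, B_bar, C, rng.standard_normal(1000))
print(y.shape)   # (1000,)
```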


Mayee and Dan discuss how to improve the robustness of contrastive learning in Thanos. Karan and Albert use S4 to generate raw audio in Sashimi. In ICML 2022, we describe audio generation with state spaces, fast learning with sparse matrices, making contrastive learning robust, and a new method for robustness. In UAI 2022, Dan, Mayee, and friends talk about how to combine weak supervision and foundation models; combining foundation models with weak supervision techniques is amazing (a toy sketch of the idea is below). We continue to look at improving long-sequence modeling with state spaces. FlashAttention (described below) is also great for sparsity; there's an MLPerf story on Tri, and it won Best Paper at Hardware Efficient Training. Importantly, it enables the first Transformer to solve the Path-X variants in Long Range Arena.
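
As a purely illustrative sketch of the programmatic (Snorkel-style) weak supervision being combined with foundation models above: the labeling functions and the simple majority-vote combiner below are hypothetical stand-ins, not the methods from the UAI paper.

```python
import numpy as np

ABSTAIN, NEG, POS = -1, 0, 1

# Hypothetical labeling functions: cheap, noisy heuristics that vote on each example.
def lf_contains_great(text):
    return POS if "great" in text.lower() else ABSTAIN

def lf_contains_terrible(text):
    return NEG if "terrible" in text.lower() else ABSTAIN

def lf_exclamation(text):
    return POS if text.count("!") >= 2 else ABSTAIN

LFS = [lf_contains_great, lf_contains_terrible, lf_exclamation]

def weak_labels(texts):
    """Apply every labeling function, then combine the votes per example.
    This uses a raw majority vote; a Snorkel-style label model would instead
    estimate each source's accuracy and correlations."""
    votes = np.array([[lf(t) for lf in LFS] for t in texts])
    labels = []
    for row in votes:
        row = row[row != ABSTAIN]
        if len(row) == 0:
            labels.append(ABSTAIN)            # no heuristic fired
        else:
            labels.append(np.bincount(row).argmax())
    return votes, np.array(labels)

texts = ["Great movie!!", "Terrible plot", "It was fine."]
print(weak_labels(texts)[1])   # [1, 0, -1]
```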


FlashAttention looks at attention through the lens of IO-aware algorithms. This implementation is the fastest we're aware of (including highly optimized variants from vendors). We're also really interested in improving the foundations of foundation models. Can foundation models offer perfect secrecy, and how does that compare to prior approaches like federated learning? (github). In Wrangle your data, we show that few-shot models, not trained for any of these tasks, get state-of-the-art performance on cleaning, integration, and imputation benchmarks. Domino helps you debug your data by discovering systematic errors with cross-modal embeddings.
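
As a rough illustration of the IO-aware idea (toy NumPy, not the actual FlashAttention CUDA kernel), the sketch below computes exact softmax attention one key/value block at a time with a streaming softmax, so the full N-by-N score matrix is never materialized. The block size, shapes, and single-head setup are arbitrary assumptions for illustration.

```python
import numpy as np

def blockwise_attention(Q, K, V, block_size=64):
    """Single-head attention computed over key/value blocks with an online
    softmax: only (n, block_size) score tiles are ever formed."""
    n, d = Q.shape
    scale = 1.0 / np.sqrt(d)
    out = np.zeros_like(V, dtype=np.float64)
    row_max = np.full(n, -np.inf)        # running max of scores per query
    row_sum = np.zeros(n)                # running softmax denominator per query
    for start in range(0, K.shape[0], block_size):
        Kb = K[start:start + block_size]
        Vb = V[start:start + block_size]
        scores = (Q @ Kb.T) * scale      # scores for this block only
        new_max = np.maximum(row_max, scores.max(axis=1))
        correction = np.exp(row_max - new_max)     # rescale old accumulators
        p = np.exp(scores - new_max[:, None])
        row_sum = row_sum * correction + p.sum(axis=1)
        out = out * correction[:, None] + p @ Vb
        row_max = new_max
    return out / row_sum[:, None]

# Sanity check against naive attention that materializes the full score matrix.
rng = np.random.default_rng(0)
Q, K, V = rng.standard_normal((3, 256, 32))
scores = (Q @ K.T) / np.sqrt(32)
weights = np.exp(scores - scores.max(axis=1, keepdims=True))
naive = (weights / weights.sum(axis=1, keepdims=True)) @ V
assert np.allclose(blockwise_attention(Q, K, V), naive)
```

The real kernel also tiles the queries and keeps the working tiles in GPU SRAM, which is where the wall-clock wins come from; this sketch only shows the blockwise math.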


Our lab works on the foundations of the next generation of machine-learned systems. I'm an associate professor in the Stanford AI Lab (SAIL) and the Machine Learning Group (bio).

  • On the machine learning side, I am fascinated by how we can learn from increasingly weak forms of supervision and by the mathematical foundations of such techniques.
  • On the systems side, I am broadly interested in how machine learning is changing how we build software and hardware.

I'm particularly excited when we can blend ML and systems, e.g., Snorkel, Overton (YouTube), or SambaNova. Our work is inspired by the observation that data is central to these systems (crudely, "AI is driven by data, not code"), and so data management principles, re-imagined, play a starring role in our work. Maybe this sounds like Silicon Valley crazy talk, but crazily enough, you've probably used a system that uses these ideas from our lab in the last few hours, due to amazing students and collaborations with Google Ads, YouTube, Apple, and more. While we're very proud of our research ideas and their impact, the lab's real goal is to help students become professors, entrepreneurs, and researchers. To that end, over a dozen members of our group have started their own professorships. With students and collaborators, I've been fortunate enough to cofound projects including SambaNova, Snorkel, and Factory, along with two companies that are now part of Apple, Lattice (DeepDive) and Inductiv (HoloClean). For the sake of transparency, I do my best to list companies I advise or invest in here, many of which involve former members of the lab.

We've been looking at how foundation models can help us build software systems, most recently:

  • An explainer of a simplified version of S4 (S4 Explainer Blog).
  • A panel on Service, Science, and Startups changing research.
  • A SIGMOD keynote on Data-centric AI, Declarative ML, and Foundation Models in data (slides, YouTube).







