I think someone will install this into their Google glasses and have their entire world be anime.
That's so fucking funny and will probably happen within a decade and people will never take them off
People already sleep in VR headsets just to wake up inside VRChat
Wut
I don't think Google Glass still exists lol
They do but not in a consumer form
I think they also got cancelled for industrial use as well
What a shame
Google glass is still around?
I thought they scrapped it publicly and moved it to the military/medicine fields because LEOs were complaining years ago that it would be used for nefarious purposes.
If it can still be obtained, could you provide a link? Thanks!
What happens if you show it your penis? Will it get censored?
1 blurry pixel.
hahah) look at my article with all plugins and AI tools
⚠️ not enough information to render accurately ⚠️
Fuckkk ouch
*Too much information
Would be the most humiliating 15 minutes of fame ever. Now is your chance to shine and enter the Guinness Book of World Records.
"Least amount of pixels needed to cover up a penis"
I mean why not? They've been running on empty since the 90s lol
Technically single pixels can't be blurry
Underrated roast
The roast was the parent comment. Unless this went over my head, I think this was just a technical clarification that the roast didn't make sense, even though it was funny.
Brutal 🔥
Asking the real questions
does not recognize this tinie))
Would you look at that, all of the words in your comment are in alphabetical order.
I have checked 1,551,529,778 comments, and only 293,693 of them were in alphabetical order.
1.5 billion?!?! Makes me wonder how many comments happen every day
After all, comments do happen here on Reddit relentlessly.
Prolly at least like 20
I hate that this was my immediate question and also the top comment. Fuck, I'm a redditor... Sorry mom..
I think it will make it bigger in size
U jus had to go right to the penis didn't u? Beat me to it lol
Username checks out
Git
Try it out and let me know.
Link originally shared by OP.
Alright so who's gonna do this
Funny answer: it will mistake it for a Tic Tac. Serious answer: it looks like Stable Diffusion, so it depends on the model used; most anime models are trained on NSFW stuff, so you would probably get an anime dick.
Average man's first thought haha
Godzilla
Damn, now that's impressive. Name of the AI?
Git
Note: the name of the AI is not Git; Git(Hub) is the tool used to share the AI's code.
Just to prevent future confusion.
Edit: (git is version control software; it lets you back up your code and collaborate with others. GitHub is the platform that hosts your repositories so you don't have to, but you could still host them yourself (although why would you do that))
The Name of our god git should be praised forever
Go on, git!
We shall forsake the old god of Subversion! And hunt down all dissenters who worship the old evil, Visual Source Safe!
I read it as "here's a link to the git repo", not "the name of the AI is Git".
The response to "What's the name" was a linked "Git". Not everyone knows what git is, and some people may assume that Git is the name of the AI.
git is a nice ai
Hmm, I think it's like "here's the git link" rather than "the tool is git", plausible deniability between the two anyways
Actually, the tool is GitHub...
The versioning tool is git. The hosting is github.
No, GitHub is the website and web interface. Git is underlying versioning tool.
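To make the git-vs-GitHub distinction above concrete, here's a minimal sketch (assuming only that `git` is installed locally; no GitHub account or network access is involved):

```python
import pathlib
import subprocess
import tempfile

# git is purely local version control; GitHub only enters the picture
# once you add it as a remote to push your repository to.
with tempfile.TemporaryDirectory() as d:
    repo = pathlib.Path(d)
    subprocess.run(["git", "init", "-q"], cwd=repo, check=True)
    (repo / "notes.txt").write_text("hello\n")
    subprocess.run(["git", "add", "notes.txt"], cwd=repo, check=True)
    # identity is passed inline so the sketch runs on a fresh machine
    subprocess.run(["git", "-c", "user.email=demo@example.com",
                    "-c", "user.name=demo", "commit", "-q", "-m", "first commit"],
                   cwd=repo, check=True)
    log = subprocess.run(["git", "log", "--oneline"], cwd=repo,
                         check=True, capture_output=True, text=True).stdout

print(log.strip())  # full commit history, with no GitHub involved
```

Pushing that history to GitHub (or any other host, including your own server) would just be `git remote add` plus `git push`; the versioning itself never needed the website.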
How does someone actually run something from GitHub? Using this as an example…
Would love to know and sorry for the newbie question.
There's no one set of instructions one could give to install and run a project on GitHub, since it hosts code that could be in any language and for any platform. Most large projects meant to be used will contain installation instructions in the README.md file. If not, you are on your own to figure it out.
If they don't have a Releases section, you're gonna have to read the documentation on how to compile it yourself.
And even with available releases, there can (and in this case there are) other dependencies that you have to take care of in order for it to work.
Figure out what it's written in and ask ChatGPT lol
GitHub just hosts the code. Any code. It could be code that runs in a web browser, a command line, a Windows app, or any other type of program.
Most of the time you'll need to know how to run the code you get from GitHub because different programming languages have different requirements, but luckily this specific app has precompiled the code for you so you can just download it and run it, which you can find here (it's the .7z file, which is like a .zip file, so you'll need a program like 7-Zip to extract it).
If a GitHub repository has precompiled code, then you can find it in the Releases section, which is in the column to the right of the main repository page. Each one is potentially different. This is an extension for AUTOMATIC1111, so follow those setup instructions first. A1111 is a locally hosted web interface for Stable Diffusion.
No need to be rude
did you make this? It's unreal
Oh, it doesn't have linux support ):
The training is the price you pay for performance here. For a regular neural network, each run is constant-time, which is very fast. Neural networks are sort of like crystals to me. There is such a thing as crystallized vs. fluid intelligence, and neural networks land firmly in the former. I understand that GPT is a transformer, but that just refers to a specific neural network architecture.
TL;DR: neural networks (and transformers such as ChatGPT) require ridiculous amounts of training, but they are very fast because they're a form of crystallized intelligence instead of fluid intelligence. This is also why ChatGPT doesn't know anything past 2021 or whenever.
The neural network is, like you said, pretrained, so the training isn't impacting the performance. I'm pretty sure the reason it's not real time is because generative AI models are long, deep networks, so results take a while. But this will be fixed in the future; it's not intrinsic.
A neural network doesn't inherently require a lot of data/training. That's very much dependent on the amount of parameters/architecture and the complexity of your problem.
Also, constant time isn't necessarily fast. A network can take 4 years to output a solution and it would still be constant time. Case in point: this network is too slow to output images in real time.
Right now, the cost is training. If something comes along and makes that a breeze, awesome. And we obviously know the constant isn't large here…
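The constant-time point above can be sketched with a stdlib-only toy forward pass: inference cost is fixed by the layer sizes, regardless of how long (or on how much data) the weights were trained.

```python
import math
import random

def forward(x, weights, biases):
    """One dense layer with tanh activation. The work done here depends only
    on the layer dimensions, so every call costs the same no matter how
    much training produced the weights."""
    return [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
            for row, b in zip(weights, biases)]

rng = random.Random(0)
# 4 inputs -> 3 outputs; in a real network these weights are what training produces
W = [[rng.gauss(0, 1) for _ in range(4)] for _ in range(3)]
b = [0.0, 0.0, 0.0]

out = forward([1.0, 0.5, -0.5, 2.0], W, b)
print(len(out))  # 3
```

The thread's caveat still applies: "constant" can still be a big constant, which is why the full-size cousins of this toy can't hit real-time frame rates.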
It doesn't "know" anything at all, it predicts the most likely next word and that has coincidental overlap with truth - a lot of the time.
Your intelligence analogy is both good and bad. It does solve its problem based on what it's been trained on, so can't create outside of that, but people mostly misunderstand the nature of what it has learned and the task it does, so the term will mislead people into thinking there is more of an equivalence to our crystallised intelligence than there is.
What does it mean to "know" something?
I wish people would think about this for more than half a second before they make confident dismissals like the above.
The model doesn't deal with facts or right and wrong. It doesn't really make sense to talk about the model knowing things because it's predicting the next word; meaning and content are emergent properties. All the model does is a text-completion task using plausible words. If you ask it "5 + 5 = ?" it's not doing the sum; it doesn't know maths. It is completing a text string, and you've got to hope that it's been trained on the right and sufficient data that what it produces happens to reflect reality.
Information is held within the weights and biases which produces answers which overlap with reality because it's been trained that way, but to call it knowledge is going too far because what it's trying to do is simulate text which could have been written by someone with knowledge, not combine the elements of knowledge to formulate an answer.
Edit: To answer your question, epistemology has sought to answer what it is to "know" something since forever. If you look at definitions such as "justified true belief", an LLM falls a long way short of meeting the criteria.
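The "predicting the most likely next word" framing above can be illustrated with a deliberately tiny stand-in: a bigram counter over a toy corpus. (A real LLM uses a neural network over subword tokens, but the task shape is the same, and the same point holds: the output reflects statistics, not truth.)

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat ate".split()

# count which word follows which; this is the whole "training" step
bigrams = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    bigrams[cur][nxt] += 1

def next_word(word):
    # return the statistically most common continuation; there is no notion
    # of truth here, only of what the corpus makes likely
    return bigrams[word].most_common(1)[0][0]

print(next_word("the"))  # cat
```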
It's just some automation on top of Stable Diffusion. But it's a cool application for sure.
Cartoon filters don't usually change smooth solid objects into rabbits or tissues.
Saw some Japanese writing on the can for a frame or two as well. Wonder if it says anything or is just gibberish like with the English.
Reminds me of "A Scanner Darkly"
And Take on Me's music video!
... is that the name of the band to the kids nowadays?
They're going to have a real A-ha moment when they realize
It's still a music video of Take on Me, therefore it is technically Take on Me's music video.
The effect used there is called rotoscoping.
You sick bastard! Release redbull-chan from her confines now!
Edit: the derpy creature on the pull tab at 0:22 might be even better
Or when the cotton ball briefly turns into a rabbit lol
Iâm not so sure you did.
I find peace in long walks.
of course there is
Drink me horny-san
I'm in need of a name.
Don't know about an anime but there is a game called "Only Cans"
Akikan
Thanks! I watched a brief synopsis and it's probably worth a quick binge
This AI can do hands, at the expense of everything else. It is the price to pay...
It's because of ControlNet. This is just Stable Diffusion, but ControlNet fixes the hands.
Yeah, I think we're not too far away from seeing some of the current 2D/3D vtubers also having a live AI-animated version of their characters/designs. It would really benefit the physically active ones that do dancing/VR stuff with a bunch of trackers right now; like filian dancing and doing backflips would probably look a lot better with real-time AI animation than with the VR trackers.
It's impressive they're able to do stuff like this which is of course not real-time yet but it does keep the same character pretty well.
Not gonna take long at this point to get 12/24 frames
You're right, this technology will never ever advance. Good work
Yep. This is called "tweening" in animation, and there are a million existing tools that do that.
Not sure if they can do it in real-time, though.
How is this related to CGPT?
People here think everything AI related should be posted here.
There's no real other popular place to put stuff like this. I mean there's r/singularity but it's just a massive hive mind there.
r/StableDiffusion
r/StableDiffusion r/artificial r/MediaSynthesis
Unfortunately they are right. Look at the amount of upvotes.
I thought he somehow used ChatGPT code to generate a cartoon filter. Apparently he didn't…
Lol that reflection of an anime girl in one frame 🤣
That made me laugh.
techbros really coping with the definition of "real time" here
I mean it definitely is real time just blink extra long
Also love how anything that's cel-shaded is just called "anime" now. And this doesn't even look like cel shading; it looks more like a generic cartoon filter, except it's by an AI in real time.
In the last frame it just inserted an anime face on the can for no reason.
Well OP didn't provide an example with how this animates faces so it looks more like a generic cartoon filter rather than "anime inspired" so the post just gives off "Thing, Japan" vibes
And how is this any different from any of the non-AI cartoon/cel shading/posterization filters out there?
Are these techbros in the room with us right now ? What's "techbro-y" here ?
"The AI will make You an Anime in Real Time"
Which "old style" animation ran with only a few frames above 1fps? 12 or 24 frames was/is standard.
whut? You mean actual books with pictures in it or something?
Even the jankiest flipbook runs at 10+ fps.
All hail the 0.3 fps anime with an ever-changing object.
This needs an AI version of "Take on Me" as background music, complete with garbled speech.
Take on me
(Take on me)
Take me on
(Take on me)
I'll be gone
INADAYOR HOOOOOOOOOOOOOOOO
DEW DEW DEW DEW DEEEW DEW DEW DEW DEW DEW DEW DEEWWW
For anyone curious who wants the TL;DR: the unique aspect here is that this is made using live video input. Before, you would have to convert videos into an image sequence, batch-feed them into Stable Diffusion, then stitch them back into a video.
This is a version of Stable Diffusion that lets you input a video source which the AI paints over frame by frame, or in this case every 10th or so frame. Each image is an individual AI render, fine-tuned to try to resemble the previous frame and the original video; the parameter controlling this is called denoising strength. The higher the strength, the more the AI will paint something different. As for the 'real time' aspect, it can be achieved by using a fairly low image resolution and having a fairly decent GPU. With a 2060 Super I can get a 512x512 image in about 5 seconds.
Anyway, fairly impressive that this uses real-time video and has a nice shiny A1111 extension UI.
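The denoising-strength idea described above can be caricatured in a few lines: treat a frame as a list of pixel values and blend in noise scaled by a strength parameter. (This is a toy illustration of why higher strength drifts further from the input, not Stable Diffusion's actual sampler.)

```python
import random

def repaint(frame, strength, seed=0):
    """Toy stand-in for img2img: mix each pixel with noise, weighted by
    denoising strength. strength=0 returns the frame untouched; strength=1
    ignores the input entirely."""
    rng = random.Random(seed)
    return [(1 - strength) * p + strength * rng.random() for p in frame]

frame = [0.5] * 8  # a flat gray "frame"
drift = lambda a, b: sum(abs(x - y) for x, y in zip(a, b))

low = repaint(frame, 0.2)
high = repaint(frame, 0.9)
print(drift(frame, low) < drift(frame, high))  # True: higher strength departs further
```

In the real extension this trade-off is what balances "resembles the previous frame" against "looks freshly painted", which is why objects occasionally morph between frames.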
10 years from now, augmented reality glasses so you can see the real world permanently anime
TIL âDenoisingâ
Yeah, I was curious on the hardware used for this video.
I'm guessing a 4090.
My 1060 6GB can do four 512x512 images in about 35 seconds (about 9 seconds per image).
Super neat though. With some interpolation (possibly this Google Research one I just found via ChatGPT), it wouldn't be too bad to dump a video in and have it process in the background.
I doubt my 1060 could get close to anything resembling "real-time", but it's wild how far we've come in only a few months.
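The per-image timings quoted in this thread convert to effective frame rates like so (these are the approximate figures users reported above, not benchmarks):

```python
def effective_fps(batch_seconds, batch_size=1):
    """Images per second, given how long a batch takes to render."""
    return batch_size / batch_seconds

# ~5 s per 512x512 image on a 2060 Super (reported above)
rtx2060s = effective_fps(5)
# four 512x512 images in ~35 s on a 1060 6GB (reported above)
gtx1060 = effective_fps(35, 4)

print(round(rtx2060s, 3), round(gtx1060, 3))  # 0.2 0.114
```

Even the faster card is roughly two orders of magnitude short of 24 fps, which matches the stop-motion feel people are describing.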
Appreciate the explanation +1
OK, that can immensely help with the animation process. It's like when games use mocap to animate 3D models; here you'd be able to just play out some scenes IRL and have ready-made keyframes you can start from.
Quick, someone take a potato chip... AND EAT IT!!!
That's not anime
it's stop motion anime Xd
Manga perhaps. Just need some kana for sound effects.
00:29 An anime girl appearing out of nowhere
Should you provide a film.
Run it through the whole of the Star Wars movies, please please please pleeeeease :) :) :)
This would pop off as a mobile app. I can already see the social media posts
I find this impressive despite the current frame rate. 0.5fps soon becomes 5fps and then 30fps.
I can imagine a fully AI generated film in the future. Most likely edited, but AI generated.
That is not real time
Sometimes it will even draw Squirtle on your desk, as a bonus I guess.
I could see this speeding up storyboards or comic/manga panel layouts
Let's take this to p0rnhub and make some hentai
AI will make anime great again.
taaaaaaaaake oooooooooooooon meeeeeeeeee
Holy shit. Game over, man. Game over.
The frightened woman at the end is what makes it genuine anime
Kinda reminds me of that movie "A Scanner Darkly"
Which ai is this?
I keep seeing these amazing videos but how to do it? Step by step guide for noobs?
What AI is this?
How do you make this?
It's impressive, but a more accurate title would be "near instant" rather than real time. There's a solid second between each frame.
Holy Jesus, that's insane
I'd prefer one you can just upload a clip to so that the output isn't stop-motion
Obviously it wouldn't be live, but if you split a source video into frames and put those into EbSynth with the anime keyframes, it'll fill in the gaps.
We need the prompt and the AI; if you agree, push ⬆️ plz
That's not real time AI. That's an occasional frame with a generic cartoon filter applied.
Generic cartoon filter that turns objects into completely different looking objects?
It's amazing how something 100% wrong gets upvoted like that
It turns an AirPods case into various fluffy objects, including a bunny and a sweat band. Your comment is just as inaccurate and yet you also get upvotes. You should probably complain about this.
Why would I complain again if my comment is still accurate, even when it targets me lol
It looks so bad. Most of the time it looks like a cheap ancient photoshop filter, and then randomly hallucinates things for single frames after. This is not worth sharing.
I think the meaning is that it doesn't look particularly different from what could be achieved with a cartoon filter.
Difference is that it is live. A simple cartoon filter cannot do this.
One frame per second is not exactly live
Watch it closely, it's interpreting and hallucinating details. At 29 seconds it sees an entire girl in the reflection of the red bull, or at 35 seconds it has completely rearranged the background. The arm in that shot is also built and posed in a different way.
That certainly isnât a filter. Whether or not it is real-time is impossible to prove from this video, but it appears to be real-time.
He links the GitHub page. Not quite real-time, but it is AI
It's definitely AI, but it's not in real time.
2 Fps anime by the look of it.
Hentai cough hentai
This has the potential to change everything.
Imagine just doing it for a movie like Star Wars and having the AI reanimate the entire thing.
Not to mention any YouTube video. Or shit, just animating your day on a GoPro.
Learn a little post-editing and you can make Attack on Titan in your back yard.
What app is this
IT DOES HANDS!
starts making hentai
The AI is seeing cute little fluffy things
cool, but not real time 💩
It is real time, just at a low frame rate.
Wait, is it rendering it in real time or is it just a cartoon filter? Cause the latter would be extremely easy compared to the former.
Did you watch the full video? There are parts where the AI confuses one thing for another so I'm guessing it's processing in real time
Still not there yet
"Make you an anime in real time" as it does 1 frame a second and repeatedly gets objects wrong.
This is just a slightly more advanced Instagram filter. AI has a place in animation, but pretending it's at a state where animators can be replaced will get you some Ex-Arm-looking shit.
Excuse me sir you dropped your tin foil hat.
Why does this have to be AI? There have been anime video filters for years. This is just a worse implementation judging by the render speed
I would call it a filter instead of AI 🤷‍♂️
This isn't real time, but if I take it at face value it is close
No excuses now for crappy animation on anime
This is amazing, thanks for sharing
So it's just rotoscope filters in real time. It's not AI.
Why is everyone so impressed with this shit when it reinterprets everything every frame inconsistently
And why are you butthurt over what other people like?
I hate this.