/agdg/ - Amateur Game Dev General

Bully yourself edition

> Upcoming Demo Day
itch.io/jam/agdg-demo-day-8

> Upcoming jam
itch.io/jam/wj2016

Helpful Links: alloyed.github.io/agdg-links/
New threads / Archive: boards.fireden.net/vg/search/subject/agdg/

> Chats
steamcommunity.com/groups/vgamedevcrew
webchat.freenode.net/?channels=vidyadev

> Previous Demo Days
pastebin.com/zsDQmN9K

> Previous Jams
pastebin.com/QwcSPdnx

> Models/art/textures/sprites
opengameart.org/
blender-models.com/
mayang.com/textures/

> Free audio
machinimasound.com/
freesound.org/browse/
incompetech.com/music/


second for
HELP

second for making a cute game

me too user

seventh for making a game about force feminization

>tfw to be an indiedev you need to be the jack of all trades
>tfw to work in a gaming company you need to be the master of a single specific field

reminder that level design IS a part of gamedev whether it angers you or not

I have done the progress, lol! :)

Enemies can now shoot at the player. They have no movement prediction yet and simply aim directly at him, but when there are more of them their spray-and-pray method actually works pretty well and it's quite hard to dodge.
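
For illustration, a hypothetical, engine-agnostic sketch in plain C# (System.Numerics; none of these names are from the actual game) of the difference between the current direct aim and a simple linear lead:

using System.Numerics;

static class Aiming
{
    // Current behaviour described above: shoot straight at the player's position.
    public static Vector2 AimDirect(Vector2 enemyPos, Vector2 playerPos)
        => Vector2.Normalize(playerPos - enemyPos);

    // Simple linear lead: estimate flight time from the current distance,
    // then aim at where the player will be after that much time.
    public static Vector2 AimWithLead(Vector2 enemyPos, Vector2 playerPos,
                                      Vector2 playerVel, float bulletSpeed)
    {
        float t = Vector2.Distance(enemyPos, playerPos) / bulletSpeed;
        return Vector2.Normalize(playerPos + playerVel * t - enemyPos);
    }
}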

>removing the engine list from the OP
for what purpose?

sabotage

jsyk hard =/= fun

>level design is game dev

doesn't want anyone else to know the secret power of source

reminder to outsmart your enemies

smooth criminal

yes, that's one of the tools you can use to design levels, which is a pillar of game development

This is actually really suitable for a hallucination or nightmare sequence or something

(RECAP MONDAY)

Feel free to reply to this post for the recap to make things easier on me!
Nice work so far everybody, we've already got 15+ games! Once again I look forward to seeing all your progress! Good job!

Game:
Dev:
Tools:
Website:
Progress:

Level design is not game dev. Source is not game dev. has to GO!!

I've always thought of making some bullshit game in Photoshop and seeing how long I could trick agdg for.

Sorry but we have expert nodev analysts that work here

Why won't these guys leave? Do they not get tired of posting the same things every day?

Then how come you guys never realized that my game isn't a game at all

Making a fake game would be almost as much work as making a real game.
The time you would save on programming would be lost to animating.

Game: Uplifted
Dev: ElevatorDev
Tools: Java
Website: elevadre.tumblr.com/
Progress:
+New menu stuff
+New enemies, items, events, maps

>guys

it's the same guy, and once you think he's left he's just shitposting about something else instead

This image makes me very upset to look at. I'm calling the cops.

Damn it
He's ban evading too, isn't he? I've seen the posts deleted many times lately.

the source engine is kinda cool

your normals are inverted bro

First attempt at using a model's normals to modify the shadow map bias and lessen acne.
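
For anyone curious, the usual slope-scale idea behind this, written as a hypothetical plain-C# sketch (in practice it lives in the shader, and the constants here are made up): the more the light grazes the surface, the larger the bias needs to be.

using System;

static class Shadows
{
    // nDotL = dot(surface normal, light direction), both normalized.
    static float ShadowBias(float nDotL, float baseBias = 0.0005f, float maxBias = 0.01f)
    {
        nDotL = Math.Clamp(nDotL, 0.0f, 1.0f);
        float slope = (float)Math.Tan(Math.Acos(nDotL));   // grazing angles -> large slope
        return Math.Min(baseBias * (1.0f + slope), maxBias);
    }
}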

second option seems much better

WHY IS SOURCE ALLOWED

I'm pretty sure that "Revengedev" was doing that, though I don't see why you'd waste your time just for (you)s

Don't worry, I gotcha buddy. Thanks anyway, though.

because it's an engine.. used for developing games?

Why do you think? It's the same as Unity.

You're essentially just prototyping the game and if you don't follow through with it you're just wasting your idea.

Just reminding all that the sourceposter forgot to samefag so his stupid image got 0 replies. There may be a lot of shitposts lately but don't forget, the shitposters are only a vocal minority and nothing more. Hide and ignore.

Anybody here who's pretty sharp with OpenGL? I have some questions. I come from /dpt/

It's because of reddit.

I know, but I need fights to be challenging, otherwise they'll be pointless and boring.

How do I stop taking my game so seriously and lose my ego?

It's nodev filth. Why is it allowed? All it's ever done is bring terrorists and nodevs on this thread.

looks like shit

>Java
nodev spotted. Get the fuck out, trash. Your kind are not welcome here!

>tfw I sometimes type "get" instead of "git"
source control is hard.

Looks fun, I dig the unusual setting. Don't listen to the shitters.

ignore this chump, looks cozy mate

>doing unity tutorial
>make an enemy model the player model
>give 2 guns instead of one

I'M DEVVING GUYS

Good. Nice! It's a well-known fact that Unity is the only way to make a game, and you're doing it, man! Keep it up!

Hate to admit but me. Shoot.

what is even going on with your Adventure Time-tier doodles? Man, you BELONG on tumblr.

game clip in this post is okay, doesn't look like anything RPG Maker couldn't do though

gj user, embrace the rush of the dev

What would be another good color to add to pic related?

Fuck, I basically haven't done anything in the meantime.

Just finally got back to work.

>engine dev posts progress
>nobody gives a shit
This is why all the good devs are leaving this place.

Sorry (not sorry) but Unity is the only way to make a game, and rightfully the only thing anyone here cares about. Fuck off, nodev.

It's not even remotely interesting.
He could have at least made it a webm.

Have I ever told you the definition of insanity?

I'm thinking about making a program for creating 2D animations, similar to Adobe Flash, but instead of vector graphics I want to use bitmaps.

I want it to ideally feel super smooth, 60 fps at least, so ideally I want to render everything with OpenGL.

Now the issue is I reckon I'll be using textures much larger than what would fit on many graphics cards, so I'll need to do many passes to composite each frame of animation together. I'm wondering how harsh this will be on performance. What I've read suggests that transferring textures to and from GPU memory is slow, so if I need to do this 30 times a frame I'll be in trouble.

Most games stuff I see seems to involve uploading the textures needed in a scene to the graphics card once and swapping them periodically as new sections of the game are loaded. So I'm not sure if I can hope for good performance if I have to swap the GPU textures multiple times on each frame.

Is it "Doing the same thing over and over again expecting the results to be different"? Because you're pretty predictable.

>post progress
>get false flagged by nodevs

take this story one step further
>give character 2 guns
>"hmmmm, 2 guns not enough, what if.... 4 guns TWO-ON-FEET!!!"

and thus, Hideki Kamiya started making Bayonetta

I'm trying to get an instantiated bullet sprite to fire down the same path as a raycast (which is fired at the mouse), both of which are supposed to fire from the same firepoint, which is able to rotate around the player by looking at the mouse.

The old way to do this was:
>transform.Translate (Vector3.right * Time.deltaTime * moveSpeed);

As Unity treated Vector3(1,0,0) as facing the "front" of a firepoint. However, at some point this was changed, so now the raycast fires towards the mouse point, but the sprite fires at 90 degrees to the firepoint. Setting the vector to (0,0,0) obviously breaks the script, since 0 * moveSpeed = 0.
If it's possible to pass the firepoint's Vector2 x,y into the Vector3, I don't know how to do that.

At the moment, I've given up and ghetto-rigged a solution by creating a second firepoint for the sprite that is rotated 90 degrees, which allows it to fire down the path correctly.

Is there a way to do this properly in Unity, or am I fucked?
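
One possible fix, as a rough sketch (standard Unity 2D API; firePointA, bulletPrefab and moveSpeed are from the post above, the Bullet script name is assumed): compute the direction towards the mouse once when the bullet spawns and move it in world space, so neither the firepoint's nor the sprite's own axes matter.

// In the gun script, when firing (assumes an orthographic 2D camera):
Vector3 mouseWorld = Camera.main.ScreenToWorldPoint(Input.mousePosition);
mouseWorld.z = 0f;                                   // 2D: ignore depth
Vector2 dir = (mouseWorld - firePointA.position).normalized;

GameObject go = (GameObject)Instantiate(bulletPrefab, firePointA.position, firePointA.rotation);
go.GetComponent<Bullet>().direction = dir;

// In the bullet script:
public Vector2 direction;
public float moveSpeed = 10f;

void Update()
{
    // Space.World so the bullet moves along the computed direction
    // regardless of how the sprite itself is rotated.
    transform.Translate(direction * moveSpeed * Time.deltaTime, Space.World);
}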

>muh feelings!
>no bully, muh tech screenshot

show me the GAMES, dog-boobs!

The sketches are placeholder.
As for the engine, I thought a 2d turn-based game would be good practice for a first game.

Sorry, no chance. The newest and coolest PCI-E (3.0 x16) gives you about 16 GB/s, which at 60 fps translates to roughly 270 MB per frame -- that's the hard upper limit for any single-card consumer machine, and nowhere near enough to refill the typical amount of VRAM every frame.

That said, you don't ever need that much full-resolution content to fill even a 4K screen, so if you do some kind of optimization based on visibility, plus plain old mipmapping when things are drawn small, you'll probably have no trouble staying under a couple of GB, which is quite normal nowadays. You'll still have to stream some of the content, but it'll be one or two textures per frame. Look up megatextures, virtual textures, tiled resources, that kinda shit.

How should priorities be set for object behaviour in GML? The tutorial I'm watching suggests that the most important things should go at the bottom, but it sounds like there's more to it.

looking pretty good already.

Does anyone know any good resources for creating vr content?

Is it generally bad practice to swap textures a couple of times in a single frame?

I'm having a hard time figuring out the best way to do this. It seems like there's a lot I need to read up on to understand whether what I want to do is feasible with OpenGL or not.

I'm considering using Skia, the 2D graphics rendering library that Chrome uses, backed by OpenGL. It seems to handle a lot of the problems I need to solve, like trying to batch operations together to limit GPU calls as much as possible. Though I'm really not sure if it will provide performance that I'll be happy with. I'd also ideally like to use 64-bit colour textures, which Skia does not support as of yet.

This may be a silly question but.. could you not simply rotate the sprite -90 degrees?

Which engine? Unity has resources on its learn page, UE4 probably does too.

I guess I explained that poorly.

The sprite fires along a vector that is at a right angle to the raycast.

My bad, should have mentioned Unity, since all I have to work with right now is a Gear VR.

Depends on the resolution and hardware, but the ideal would be to load assets to the device once and let them stay there, but obviously you can stream up to a point. As I mentioned, you can also have some kind of heuristics and culling (texture not visible -> skip it altogether, texture very small -> only upload a small mip level, etc.)
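
As a hypothetical sketch of that heuristic (plain C#, names made up): decide which mip level of a source bitmap is worth uploading from how large the element actually appears on screen, and skip it entirely when it's culled.

using System;

static class Streaming
{
    // Returns -1 for "don't upload at all", 0 for full resolution,
    // higher numbers for progressively smaller mip levels.
    static int ChooseMipLevel(float onScreenWidth, float sourceWidth)
    {
        if (onScreenWidth <= 0f)
            return -1;                                   // not visible: skip the upload
        float ratio = sourceWidth / onScreenWidth;       // e.g. 4096 px source drawn at 512 px -> 8
        return (int)Math.Floor(Math.Log(Math.Max(ratio, 1f), 2.0));
    }
}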

Doubling the colour depth doubles the amount of data (and slows the rendering, though you seem to expect to be more memory-bound than computation-bound). What exactly are you expecting to do where this would make a difference? Most games render at 8 bits per colour channel for the final image and 16-bit floats for intermediate colour buffers.

Ah I see, not that the sprite itself is rotated but it has a perpendicular ray.

Interesting.

So you click, and you can see a ray being drawn from the origin towards the mouse, but the sprite is projected perpendicular to that ray?

Have you tried using simply transform.Translate(Vector3.forward * Time.deltaTime); ?

tell me your game idea and i'll make it happen

Imagine pong but the ball is invisible

undertail

first person low poly/doom style sprites with 3d models dark souls kings field clone

geoffrey leonard simulator

interesting. I'm going to sound like a thicky here...

But can I upload multiple textures at the same time? As in, say, have one texture that's an atlas of loads of shit, and another texture in a different colour format that I use as an off-screen render target? Most of my graphics programming experience is with XNA, which didn't allow multiple textures too easily, and I at least presumed there was a reason why. I get a bit confused about how this works exactly in OpenGL. I could probably do some clever shit to reduce the number of texture swaps to a minimum.

>16-bit floats for intermediate color buffers.
Oh? For what kind of stuff? HDR related?

cute anime schoolgirl fps

First person metroidvania. Leaning more towards the vania, but with ranged weapons.

>Have you tried using simply transform.Translate(Vector3.forward * Time.deltaTime); ?

That only works for 3D, since it translates along z.

Oh, GL supports bindless textures (at least via an extension), so you can have any number of textures that fit in memory. Or the traditional 8 or 16 bound at once if you use regular binds (XNA probably uses an oldish version of DX which forces it to have a similar restriction, not familiar with it though).

You can start uploading as many as you like, so there's no extra overhead, but the bus has a set pace, so it'll only send them one at a time and the maximum bandwidth won't increase.
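
To be concrete about the non-bindless route, an OpenTK-flavoured C# sketch (the binding library and all the IDs here are assumptions; the raw GL calls look the same in C/C++): several textures can be visible to a single draw call by binding them to different texture units and pointing each sampler uniform at its unit. An off-screen render target is a separate thing: you attach a texture to an FBO, render into it, and later bind that same texture to a unit to read from it.

// Assumes a GL context plus already-created texture and shader objects.
GL.UseProgram(program);

GL.ActiveTexture(TextureUnit.Texture0);
GL.BindTexture(TextureTarget.Texture2D, atlasTexture);       // the big sprite atlas
GL.ActiveTexture(TextureUnit.Texture1);
GL.BindTexture(TextureTarget.Texture2D, paletteTexture);     // a second texture, different format is fine

GL.Uniform1(GL.GetUniformLocation(program, "uAtlas"), 0);    // sampler "uAtlas" reads unit 0
GL.Uniform1(GL.GetUniformLocation(program, "uPalette"), 1);  // sampler "uPalette" reads unit 1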

>HDR related?
Exactly, 16 bits is enough to represent the [0,1] range pretty exhaustively and can represent values outside of it. Large numbers become imprecise pretty quickly, but it's not usually an issue (since the order of magnitude is more important than the value itself) -- it's a pretty okay tradeoff between memory usage and quality.

And if you can't require bindless textures to be available, you can fudge it with texture arrays.

True, and it's especially simple if the textures are of equivalent size. Otherwise mipmapping will be a bit troublesome (you just have to do it by hand, not a big deal).

Finally have time to work on my game. Here's some progress. Posting current version in next post.

Put your name back on, googs.

I got it.

So, my idea for how I'd render my 2D shit would be to treat it like a UI scenegraph. I'll have a tree of elements that need to be rendered to the viewport, and each branch needs to be rendered down to a render-target texture, eventually getting to the point where I can composite all the off-screen render targets to the actual viewport. Only the branches whose contents have changed will need to re-render their children on a given frame.

So I guess I can upload a texture array of all the graphic elements in one branch, say the head, torso, legs etc. of a character, then upload an empty render target for them to be rendered to. Then when that is done, swap out the atlas texture for the next branch that needs to be rendered. I'll keep track of all the render targets, and later I'll composite them and render them to the viewport.

Does that sound reasonable?
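
A minimal sketch of the dirty-branch part of that plan in plain C# (every type name here is a placeholder, not a real API): a branch keeps its cached off-screen target and only re-renders its children when something under it has changed.

using System.Collections.Generic;

// Placeholder interfaces standing in for whatever the real renderer provides.
interface IRenderTarget { }
interface IRenderer
{
    IRenderTarget GetTargetFor(BranchNode node);
    void SetTarget(IRenderTarget target);
    void Composite(IRenderTarget source);
}

class BranchNode
{
    public List<BranchNode> Children = new List<BranchNode>();
    public bool Dirty = true;                // flipped whenever anything under this branch changes
    IRenderTarget cached;                    // this branch's off-screen texture

    public IRenderTarget Render(IRenderer r)
    {
        if (!Dirty && cached != null)
            return cached;                   // unchanged branch: reuse last frame's result

        cached = r.GetTargetFor(this);
        r.SetTarget(cached);
        foreach (var child in Children)
            r.Composite(child.Render(r));    // render children first, then composite into this target
        Dirty = false;
        return cached;
    }
}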

What are bindless textures, exactly? Do they just allow me to have multiple textures on the GPU?

Isn't (1,0,0) a vector3 of length 1 pointing in the positive x-direction? Could you post any code that might be relevant here? This should be pretty simple for sure but I don't really get how you're doing it

In my gun script I do:
>Instantiate (bulletPrefab, firePointA.position, firePointA.rotation);

Then in my bullet script I do:
>transform.Translate (Vector3.right * Time.deltaTime * moveSpeed);

Yeah, that sounds okay. You'll probably get away with using way less resolution than you initially think (I've done an animated short where I thought we had a shitload of assets, but it ended up being ~400 MB of texture memory, and most of the stuff had 4x the resolution we actually required).

Yeah, bindless just allows you to have more textures (and not keep track of any texture indices, it's just like a pointer to GPU memory)