Progress

Re: Progress

Postby hc » 18 Nov 2012, 12:24

Yesterday I finally found the time and motivation to implement collisions with trees. No more walking through trees that easily, though you can still walk through flowers and bushes. Before implementing it I had to clean up the landscape and plant generation and rendering somewhat to make the essential parts reusable. There is still some dirt left, though: the rendering and collision of trees still depend on a sequence of random numbers drawn for other foliage types.
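
Just to illustrate that dependency (a hedged sketch, not the actual LinWarrior code - all names here are made up): if tree placement is drawn from a seeded random sequence shared with the other foliage types, the collision pass has to replay exactly the same draws as the rendering pass, otherwise trunks and collision shapes end up in different places.

Code:
#include <cstdint>
#include <vector>

// Tiny deterministic PRNG (xorshift) seeded per terrain patch so that
// rendering and collision can replay the identical sequence.
struct FoliageRng {
    uint32_t state;
    explicit FoliageRng(uint32_t seed) : state(seed ? seed : 1u) {}
    float next() { // uniform float in [0,1)
        state ^= state << 13; state ^= state >> 17; state ^= state << 5;
        return (state & 0xFFFFFF) / float(0x1000000);
    }
};

struct TreeInstance { float x, z, radius; };

// Both the renderer and the collision check call this with the same patch seed.
// The flower/bush draws must not be skipped, or the tree positions would change.
std::vector<TreeInstance> placeTrees(uint32_t patchSeed, int flowerCount, int treeCount) {
    FoliageRng rng(patchSeed);
    for (int i = 0; i < flowerCount; ++i) { rng.next(); rng.next(); } // burn the flower draws
    std::vector<TreeInstance> trees;
    for (int i = 0; i < treeCount; ++i) {
        trees.push_back(TreeInstance{ rng.next() * 100.0f, rng.next() * 100.0f, 0.4f + rng.next() });
    }
    return trees;
}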

Talking about dirt: another small feature I've added is environmental dust or pollution particles floating around you. I think this small feature may enhance immersion and depth perception, as you can see in many recent 3D movies.
github.com/hackcraft-de
hackcraft.de

Re: Progress

Postby hc » 14 Jan 2013, 23:59

Another small feature that's on its way - one that's going to have some impact - is file/resource loading relative to a set of given data root directories.

Taking the order of these directories into account allows new resources to override old ones.
That means that while normally all files - including configuration files - are loaded from a default directory, you will be free to add extension, mod and mission directories that change individual aspects of the simulation.

This will ultimately ease individual modding and asset production.
As a side effect, packaging for OS distributions may become simpler because the data directories' location can be changed or included as a fallback path.

Stand by for updates...

Re: Progress

Postby hc » 16 Jan 2013, 20:51

The change for loading data files from preferred data package directories is now available in the public git repository.

By default, data files are now searched in the following reverse order (listed last to first):

4. "" or absolute path (this is only useful during development to load files from anywhere in your filesystem using an absolute path).
3. "data" directory relative to the current working directory (out of the box the linwarrior binary is called from the project root directory via a script file). (fallback/default)
2. "ext" directory relative to the current working directory, which doesn't exist by default and is intended for extensions.
1. "mod" directory relative to the current working directory, which doesn't exist by default and is intended for mods (based on the default data + extensions). (override)

By giving the -p or --path flag followed by a directory you can add more override paths; the last one given will be the first searched for files.
Important: the directory path must not end with a trailing slash but may start with a leading slash (without a leading slash it is, of course, relative to the current directory).
Example script:
Code:
./dist/linwarrior game [....other flags] -p /home/myusername/mylinwarriordata
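
For the curious, the lookup itself can be pictured roughly like this (a minimal sketch, not the actual loader code - the helper name resolveFile is invented for the example):

Code:
#include <fstream>
#include <string>
#include <vector>

// Roots in search order: the first hit wins, so override roots come first
// and each -p path would be pushed to the very front of this list.
std::vector<std::string> gDataRoots = { "mod/", "ext/", "data/", "" };

std::string resolveFile(const std::string& path) {
    for (const std::string& root : gDataRoots) {
        std::string candidate = root + path;           // naive join, good enough for a sketch
        if (std::ifstream(candidate.c_str()).good()) {
            return candidate;                          // first existing file wins
        }
    }
    return path;                                       // last resort: absolute/plain path
}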


Use case I: assume you would like to customize the global.properties configuration file just for your own experience (leaving the project directory untouched). Besides environment conditions, this would for example enable you to use a different vehicle.

Solution I: create the /home/myusername/mylinwarriordata/base directories and copy global.properties there from the linwarrior/data/base directory. Then add to the linwarrior command line/script:
Code:
-p /home/myusername/mylinwarriordata


Use case II: assume you would like to replace (actually only override!) some default textures with some of your own (or ones sourced from OGA), say a ground texture.

Solution II: as in Solution I, create the base/landscape/ground directories within your /home/myusername/mylinwarriordata directory and put a (square) ground_stones.tga file there. When exporting with GIMP make sure that RLE compression is DISABLED - a zipped TGA is smaller anyway, and linwarrior can only load non-RLE TGA files. Make sure that your active global.properties file contains:
Code:
landscape.ground=stones


And feel free to post shots if you try anything. :cool:

Happy Modding ! :twisted:

Re: Progress

Postby hc » 29 Jan 2013, 20:02

Now the player model can be set in the global.properties file. The new thing is that besides some pre-compiled model names you can directly provide a path to a properly rigged md5mesh file.

The model can be located either in any of the searched directories - as described in the previous post - or in any global directory (via an absolute path).

In /base/global.properties, using a pre-compiled model name which is resolved to a fixed filename:
Code:
mission.player.mech=frogger


In /base/global.properties, using a filename inside one of the search paths:
Code:
mission.player.mech=/base/model/wanzer/frogger/frogger.md5mesh


In /base/global.properties, using a filename outside of the search paths:
Code:
mission.player.mech=/home/myuser/mymodels/frogger.md5mesh


Note that the model has to be modeled and rigged in MisfitModel 3D using the right rig - which can be borrowed from the provided models (frogger.mm3d for frogger.md5mesh). The model then has to be exported using the MisfitModel 3D to md5mesh exporter (source provided in the linwarrior/tools directory).
There is no support for other exporters for now (i.e. not tested).

Re: Progress

Postby hc » 31 Jan 2013, 18:57

Talking about paths... good news for prospective distribution packagers.

In a recent commit the /usr/share/games/linwarrior/ path was added to the file loader.
When installing linwarrior, the only thing necessary regarding the data is to copy the data folder into that directory.
No more patching is necessary for that.

Yes, including the data folder itself - so that there will be ..../games/linwarrior/data/....

The mentioned linwarrior/ext and linwarrior/mod directories are in that search path, too, for future extension and modding.

Btw, Linwarrior and its code'n'build framework are moderately licensed under the Apache license.
Assets not under the same license are under CC0/CC-BY/-SA.

Re: Progress

Postby hc » 31 Jan 2013, 22:58

Another quick update: The Linwarrior homepage got a small overhaul.

The media and download sections are now separated, and the new media page starts out with several embedded videos.
Lots of qubodup's videos and a video by skorpio.

Both pages will probably need some more updates in the future, adding images etc.
Maybe the history page will disappear and be reborn as a part of the media page.

So, ...
[waving two fingers slowly] me: You want to take a look at http://hackcraft.de/games/linwarrior_3d ... NOW!

Re: Progress

Postby hc » 10 Mar 2013, 20:51

[Guru Meditation Ahead]

Recently I've had a sudden inspiration.

For some time I've tried to stick everything into the World-System, including game logic.
Now, after seeing GameMain more as a controller (after some refactoring), it became obvious that the Mission-System shouldn't be a subsystem of the "World" but a part of the controller that is layered atop the "World" - so I moved it.

Along with that inspiration came this: the "World" is a kind of physical world simulation based upon uniformly applied rules and calculations, which forms a flow. The "Game" is the artificially superimposed game logic with heterogeneous rules and interactions on the "World" that disrupt the flow.

Like Nature- and Business-Laws.
Like Flow and Jump.

Of course the World-System simulates more than just physics, but it remains more of a flow-like simulation, whereas the game logic uses and alters that flow through interactions from the outside.

So far there is no harm in sticking a lot of game-like logic and interaction into the "World" flow simulation, but there are things that need to "stick out" and should become part of the next higher layer - which is the game controller.
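
A rough sketch of the layering as I picture it (illustrative class names only, not the actual source):

Code:
// The "World": the uniform, flow-like simulation, advanced by the same rules every tick.
struct World {
    void advance(float dt) { /* physics, foliage, weather ... the "flow" */ }
};

// The Mission / game logic: heterogeneous rules imposed on the world from outside.
struct Mission {
    void checkAndApply(World& world) { /* objectives, triggers, spawns ... the "jumps" */ }
};

// The controller layered on top: it owns both and decides who acts when.
struct GameMain {
    World world;
    Mission mission;
    void tick(float dt) {
        world.advance(dt);            // let the flow run
        mission.checkAndApply(world); // then let the game logic disrupt/steer it
    }
};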

Apart from these considerations, little actual code has changed that would require further changes - so far it was merely a paradigm shift, a moving of code and a delegation of responsibility.

Besides that, I was able to simplify some more code in GameMain related to threading - but that's another story...

Re: Progress

Postby hc » 24 Jun 2013, 21:38

Just a small keep-alive. ;)

Re: Progress

Postby Julius » 24 Jun 2013, 22:16

We demand real progress though :p

Re: Progress

Postby hc » 30 Jun 2013, 15:46

Thank you for your interest and your critical demands.

Well, I was just now adding support for stereoscopic 3D,
including support for head mounted displays that use fish-eye lenses.

I already managed to get one part of the work done last year, without revealing much of its use:
http://forum.freegamedev.net/viewtopic.php?f=50&t=1313&start=25#p36777
That was actually just the monoscopic projection applied to both eyes - at that time without hardware to test and adjust the values.

Then this year I got around to tweaking the values for the monoscopic stereo rendering (screen-eye distance, warping factors).

And finally, this weekend I got around to adding the other major half:
actually rendering two separate images for the eyes into different framebuffers and merging them afterwards with the fish-eye projection.

A side effect of the port is that the post-processing shaders are enabled all the time at the moment, which changes the looks and may or may not look nice.

The current source is in the git repository.
To start just run the provided linwarrior-hmd script.
Within the script you will notice the new command line parameter -c or --cyber, followed by the virtual eye separation distance (in meters?).
You may adjust this value to enhance or lessen the 3D effect - or rather, to adjust it for your comfort.
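
For example, mirroring the earlier script example (the value 0.064 is only an illustration of a typical human eye distance in meters - tune it to taste):

Code:
./dist/linwarrior game [....other flags] -c 0.064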

Due to the lack of drivers (and possible bugs in the generic HID drivers) and the lack of information about the protocol (which wouldn't be too hard to decode given a working input stream) there is no tracking support. I have not registered for a developer account (yet) because of the soul-selling contract, and there is no Linux support anyway. I just need a working hardware interface, nothing fancy...

I fear this situation may again lead to a pile of non-standardized hardware with tons of unknown interfaces.
Gamepad button and axis layout anyone?

We need an Open Game Devices Board to unify gaming interfaces and thus enable open gaming on the PC.
And by open gaming I do not mean only freeware and open source.
A manager of a well-known company stated that their competition isn't just the console market but the whole entertainment industry, without boundaries.
Likewise, the principal open platform - the PC - has to compete with consoles.

While the PC should not become a closed console, the PC world would do well to be similarly consumer friendly - and that means end-user AND developer friendly:
Give developers what they need, unify what should be unified, make customers happy with whatever hardware they buy.
Nothing big; I just envision being able to go to a store, buy a gamepad, plug it in, start, say, SuperTuxKart, and have a gamepad layout that works out of the box - or choose one from a list of options.
At this moment it is not even possible as a developer to know the gamepad layout of even a well-known brand without buying and testing - zero developer documentation.
Why do those things serve the same purpose, look the same and yet randomize their button and axis layout?
Granted, PC gamepads derive from different brands of console gamepads produced by third-party companies to fit those consoles, but that only counts for the ones that resemble different console gamepads, not for those that look like they were created equal. Oh, and then there are gamepads that derive from PC joysticks plus added throttle and rudder...

:eew:

If I keep on writing we can make a nice little three-page FreeGamer rant ;)

[Edit Tag: Open Source Mech Game Oculus Rift]
Attachments
LinWarrior-stereoscopic-2013-06-30.png

Re: Progress

Postby Julius » 30 Jun 2013, 16:45

Ah, cool VR support. Definitely a great feature for a mech game.

Regarding gamepads... well, since relatively recently the Xbox360-compatible gamepad is a sort of de facto standard under Windows that most games support well. I have been using a 3rd-party USB PS3 pad with Xbox360 "emulation" quite successfully for most games, without ever modifying the configs. Under Linux there is also an Xbox360 driver (called "xboxdrv") that seems to work well with that PS3 pad, and the few games I tried (and the Steam big-picture mode) worked more or less fine too.

Re: Progress

Postby dusted » 04 Jul 2013, 13:43

Awesome job getting Oculus support in there! I definitely want to get one at some point! Just feeling too poor atm =(
How was implementing it?
It might be cool to have a short post about your experience of adding VR support to the project :D

Re: Progress

Postby hc » 05 Jul 2013, 17:27

Hi, I'll respond at length when I get to it.

You can get the consumer version, which will make the dev version obsolete anyway
- but could make it a precious piece of history.
Btw, just yesterday or so they announced that Linux support will eventually come.
But I'd still prefer a low-level interface, aka a driver, without the legal fuss around IP.

Re: Progress

Postby Skorpio » 05 Jul 2013, 19:29

I'm just looking at the stereoscopic screenshot with the cross-eyed view. It looks nice; the effect could be a bit stronger, though. I wonder how it looks with the Oculus. Also, you really need high-res textures. :p

Re: Progress

Postby hc » 05 Jul 2013, 20:36

Oh yeah, I wanted to mention the cross-eyed method; it's nice for a sneak preview.

Well, as for the effect and its strength:
As mentioned, the "stereo separation" or distance between your virtual eyes can be adjusted by a variable - I like to think of ordinary sharks versus hammerhead sharks.
You may adjust it, but since our eyes have a (personally) fixed distance from each other, an exaggerated change is basically perceived as a shrunken or enlarged world (or self) and may even feel uncomfortable.

Another variable (fixed right now), btw, is the "real" eye separation, which determines where on the physical screen each image center lies (think of a reticle) - for grown-ups it is similar but can vary within a range of a few millimeters. Going into more detail, it may even differ between nose-center-to-left and nose-center-to-right (like an owl's ears). Enough babbling; the point is: you'll notice that the left and right images are shifted towards the center of the whole image, and (sadly) there is some asymmetric cut-off.
This cut-off could only be removed by shrinking the L/R images - which now finally leads to "how it looks with the oculus".
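
For the curious, that shift follows from simple geometry (an illustrative snippet, not the exact names or values used in the code): each half-image covers half of the physical screen width, and an eye whose lens center sits half the eye separation away from the screen center gets its image center offset within its own half.

Code:
#include <cstdio>

int main() {
    float screenWidth   = 0.14976f; // physical screen width in meters (roughly DK1-like, illustrative)
    float eyeSeparation = 0.0635f;  // physical lens/eye separation in meters (illustrative)

    // Offset of the projection center inside each half-image, in normalized
    // coordinates of that half (-1..1). Positive means shifted towards the middle
    // of the whole screen - which is why the two images appear pushed together.
    float centerOffset = 1.0f - 2.0f * eyeSeparation / screenWidth;

    std::printf("per-eye center offset: %.3f\n", centerOffset); // about 0.15 here
    return 0;
}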

A very big part of the impression with an HMD (head mounted display) is really the FOV (field of view), which gives you *immersion*.
Even without stereoscopic rendering it is just a great dive-in - the stereoscopy adds some more realism or touchability, but the immersion mostly comes from the FOV and the mountedness.
These last two points can really lead to a strong reaction, which I like to diagnose as a kind of motion sickness (nausea), part of a space adaptation syndrome:
http://en.wikipedia.org/wiki/Space_adaptation_syndrome
It is caused by your visual perception being totally different from what your inner ear tells you.
Therefore it's best not to move your head at all - at least as long as there is no tracker linked to the simulation to give you the right visuals for your movement.

Hm, I haven't really played with the usual rendering FOV degrees yet; I should do that to match them up.

I'll delve into the technical rendering issues later.

As for the textures:
I do have all these photo textures in higher resolution (that's a principle*), but for various reasons I didn't put them in at higher res; I may eventually replace them with somewhat better ones.

[EDIT: principle*: create data in incredible resolution and just downscale to what you need when you need it]

Re: Progress

Postby Julius » 05 Jul 2013, 22:02

Hmm, I think the Valve guys mentioned somewhere that they found in their VR experiments that a HUD GUI like yours really doesn't work with VR goggles. So I guess you will have to switch to a real cockpit view for that (which would IMHO be nice for your mech game in general, too ;) ).

Re: Progress

Postby charlie » 05 Jul 2013, 23:38

I think one of the reasons it causes motion sickness is that although you get depth perception, your lens has to focus very near. Looking far, focusing near => brain dump.
Free Gamer - it's the dogz
Vexi - web UI platform

Re: Progress

Postby hc » 05 Jul 2013, 23:55

It's true that the HUD as it is doesn't really 'work'.
You cannot get everything you want when you need or want it.

The actual problem is that what's visible through the lenses is a circular part of the image.
And as can be seen, there is cut-off because the images are centered on the eyes, as described before.
Scaling the HUD down towards the center would plainly work; whether it feels good and fits etc. are other questions.
Technically, the HUD is currently at virtually infinite distance but in focus.

I like to switch the HUD off when using an HMD, for pure immersion; I had added the "h" key for toggling it but forgot to mention that.
Not that switching it off adds to playability...
Yes, for immersion an eye-projected HUD feels odd... today... wait until people get used to Google glasses.
It's kinda cyberpunk to have such a HUD - feels cyber - you can't compare it to an FPS on a flat screen.

To me it currently feels more like a remotely operated drone with the HUD; without it, it's me moving through the world.

Again, there is no tracking yet, which is another game changer.
I may perhaps go back to 3D instrument panels - now that there are render buffers it's technically possible.
But, funny, actually, real instruments are soo retro - but mechs are fiction soon to be superseded by science anyway, aren't they? ;)

Re: Progress

Postby hc » 06 Jul 2013, 00:07

@Charlie: au contraire, your eyes are relaxed, focusing far; the lenses do the trick.
And you can get motion sickness from plain 2D when you turn your avatar fast.
It really is the perceived optical rotational acceleration (vs. your other senses).

Re: Progress

Postby hc » 06 Jul 2013, 09:53

Now finally a word or two about the implementation.

First, the Oculus Rift behaves as an ordinary display because it is one.
Therefore it appears as a secondary display which can extend your desktop.
Because of the resolution I was not able to put it into mirroring mode like a beamer.
So far I have only been able to render to a window (not fullscreen), and
I have to use the mouse to drag the window to the other display.
But it appears borderless and fullscreen on the Rift anyway - that's a limitation of the optics.

As for the actual render pipeline:
I create OpenGL framebuffer objects (FBOs: render to texture), just translate the camera along its x axis for the left (and right) eye, and render as usual.
Afterwards I do a shader-based post-processing effects step using the FBO textures and render into another FBO.

This post-processing is optional and isn't related to the stereoscopic rendering itself.

Having rendered the left and right FBOs, they need to be merged into the final display buffer.
This comes in the shape of another shader post-processing pass which uses the left and right buffers as input textures.
For each half of the display the warping is then applied, i.e.:
for each screen pixel coordinate an input coordinate is calculated using a warping function (fisheye projection).
It's like raycasting through a lens.
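
In very rough pseudo-OpenGL the whole stereo pass looks like this (a sketch of the idea only - names like renderScene, drawScreenHalf and the FBO/shader setup are placeholders, not the actual code):

Code:
#include <GL/glew.h>

extern GLuint leftFbo, rightFbo, leftTex, rightTex; // eye framebuffers and their color textures
extern GLuint warpProgram;                          // barrel-warp merge shader
extern void renderScene(float cameraXOffset);       // normal scene render, camera shifted on x
extern void drawScreenHalf(int half);               // fullscreen quad covering one half, 0=left 1=right

void renderStereoFrame(float eyeSeparation, int width, int height) {
    // 1) Render each eye into its own framebuffer, camera shifted by +/- half the separation.
    glBindFramebuffer(GL_FRAMEBUFFER, leftFbo);
    glViewport(0, 0, width / 2, height);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    renderScene(-0.5f * eyeSeparation);

    glBindFramebuffer(GL_FRAMEBUFFER, rightFbo);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    renderScene(+0.5f * eyeSeparation);

    // 2) Merge both eye textures into the default framebuffer; the fragment shader
    //    does the barrel warp per screen half (per-pixel lookup into the eye texture).
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glViewport(0, 0, width, height);
    glUseProgram(warpProgram);
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, leftTex);
    drawScreenHalf(0);
    glBindTexture(GL_TEXTURE_2D, rightTex);
    drawScreenHalf(1);
    glUseProgram(0);
}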


Previously I wasn't using FBOs but did the mentioned optional post-processing by copying to textures,
which is okay when only doing monoscopic rendering.
Now that I have FBOs I may think about other odd things, like rendering individual instrumentation displays as if they were real displays,
or adding a zoom overlay.


[EDIT: The "fisheye projection" is actually technically called a "barrel warping"]

Re: Progress

Postby hc » 06 Jul 2013, 10:24

As for tracking:
I was able to open the HID (human interface device) with testing tools and a minimal C program, but I always just got a short data stream which ended after pumping some data.
As the trackers produce a lot of data, there is a huge data flow compared to other HIDs like mice and keyboards.
Too much data?
Data overflow?

On the Oculus forums I saw that this may be a bug in some Linux versions.
Btw, while I can read the forum, one may not download attachments posted by other users (not even a user's own source code for Linux).
I've tested with Mint 14 and Raspbian (Raspbian Wheezy, I think).

The data stream itself was showing obvious patterns, no wonder.
And sensor fusion can be tackled: there are several open source libraries, plus literature and tutorials, which already solve such problems.
And other developers are already on their way putting together libs...

Edit: signal11's hid lib is said to work, if anybody is interested in going ahead.
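
For reference, opening and pumping the tracker HID with that lib boils down to something like this (a sketch; the 0x2833/0x0001 vendor/product IDs are what the DK1 is commonly reported as - check with lsusb - and the report bytes are left undecoded):

Code:
#include <cstdio>
#include <hidapi/hidapi.h>

int main() {
    if (hid_init() != 0) return 1;

    // Vendor/product IDs as commonly reported for the Rift DK1 tracker.
    hid_device* dev = hid_open(0x2833, 0x0001, NULL);
    if (!dev) { std::fprintf(stderr, "could not open tracker HID\n"); return 1; }

    unsigned char report[64];
    for (int i = 0; i < 100; ++i) {
        int n = hid_read(dev, report, sizeof(report));   // blocking read of one report
        if (n <= 0) break;                               // the stream ending early was exactly my problem
        std::printf("report: %d bytes, first byte 0x%02x\n", n, report[0]);
    }

    hid_close(dev);
    hid_exit();
    return 0;
}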

Re: Progress

Postby hc » 04 Aug 2013, 16:25

Just a quick heads-up for now:

Finally (on Mint 15, btw) I got the OpenHMD (http://openhmd.net/) and libvr (http://hg.sitedethib.com/libvr) libs/demos to compile and run for Oculus Rift head tracking.
Both use signal11's hidapi (http://www.signal11.us/oss/hidapi/) [with the libusb backend in my case].

libvr is intended more as a lower-level lib and OpenHMD as a higher-level solution.
I usually prefer the more dedicated solution which doesn't make assumptions about my requirements -
in this case I ended up trying OpenHMD in my experimental git branch anyway.
But here again I just use the bare minimum of low-level features:
I asked OpenHMD to just hand over the quaternion for the head movement, and the lib gave it happily with just a small initialization, an update call and a request for the value.
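
The bare minimum usage looks roughly like this (written from memory of the OpenHMD C API - treat the exact names and header path as approximate):

Code:
#include <cstdio>
#include <openhmd.h>   // header path may differ depending on how the lib is installed

int main() {
    ohmd_context* ctx = ohmd_ctx_create();
    if (ohmd_ctx_probe(ctx) <= 0) { std::fprintf(stderr, "no HMD found\n"); return 1; }

    ohmd_device* hmd = ohmd_list_open_device(ctx, 0);    // first detected device

    for (int i = 0; i < 1000; ++i) {
        ohmd_ctx_update(ctx);                            // pump the tracker
        float quat[4];
        ohmd_device_getf(hmd, OHMD_ROTATION_QUAT, quat); // head orientation as a quaternion
        std::printf("quat: %f %f %f %f\n", quat[0], quat[1], quat[2], quat[3]);
    }

    ohmd_ctx_destroy(ctx);
    return 0;
}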

Well, it will stay in the experimental branch since the OpenHMD and hidapi dependencies are not yet readily available everywhere.

Yes, some lag is noticeable with fast head movements, so
now it's maybe time for some experiments with "prediction" - i.e. extrapolating values.
Prediction methods are generally well known and not something out of this world.
It's more of a value-tweaking business, because if you overdo it you'll get overshoot once your head movement stops.
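
One cheap way to experiment with that (a naive sketch, not what's in the repository): keep the previous and the current orientation quaternion and slerp a bit beyond the current one - a factor t > 1 extrapolates, and overdoing it gives exactly the overshoot described above.

Code:
#include <cmath>

struct Quat { float w, x, y, z; };

// Spherical linear interpolation between a and b; t > 1 extrapolates past b.
Quat slerp(Quat a, Quat b, float t) {
    float d = a.w*b.w + a.x*b.x + a.y*b.y + a.z*b.z;
    if (d < 0.0f) { b = Quat{-b.w, -b.x, -b.y, -b.z}; d = -d; } // take the short way around
    if (d > 0.9995f) { // nearly identical: plain lerp plus renormalization is fine
        Quat r{ a.w + t*(b.w-a.w), a.x + t*(b.x-a.x), a.y + t*(b.y-a.y), a.z + t*(b.z-a.z) };
        float n = std::sqrt(r.w*r.w + r.x*r.x + r.y*r.y + r.z*r.z);
        return Quat{ r.w/n, r.x/n, r.y/n, r.z/n };
    }
    float theta = std::acos(d);
    float sa = std::sin((1.0f - t) * theta) / std::sin(theta);
    float sb = std::sin(t * theta) / std::sin(theta);
    return Quat{ sa*a.w + sb*b.w, sa*a.x + sb*b.x, sa*a.y + sb*b.y, sa*a.z + sb*b.z };
}

// previous = sample one update ago, current = latest sample.
// A leadFactor of ~0.2..0.5 predicts a fraction of one update interval ahead.
Quat predictOrientation(const Quat& previous, const Quat& current, float leadFactor) {
    return slerp(previous, current, 1.0f + leadFactor);
}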

Re: Progress

Postby dusted » 07 Sep 2013, 00:10

Just pulled from git and tried this on my Oculus (on Arch Linux). Assuming it is rendering-only, it works REALLY nicely!
The graphics are perfect for the Rift! Very nice feeling of depth, too!
The UI is not working well, but that is a general problem with the Rift; one way to amend this may be to use head tracking so you can "look down at the HUD", but that's potentially a lot of work.
Just wanted to say: great work!

Re: Progress

Postby hc » 10 Sep 2013, 23:11

Great! :)
Thanks for trying!

Any remarks about compiling or installing OpenHMD? And/or hidapi?
I kind of remember having some inconveniences but can't remember which - that may have been with libvr after all, I don't know.

I was wondering whether anybody would/could try this - given these special requirements.
As for the HUD - that's why I added the 'h' key (I think) to hide the 2D HUD... it isn't really the best solution.
Yeah, now with tracking it could be "fixed" in 3D space and more, or at the very least it just needs to be rescaled to fit into view.

Interesting (or bad): as you can see there is a lot of hidden display space which is out of view (kind of wasted display real estate) by design.

Re: Progress

Postby Julius » 18 Sep 2013, 18:45

http://enemystarfighter.com/blog/2013/9 ... ns-learned

Some interesting tips in regard to VR HUDs etc.
