# Running a Dungeon Crawl Stone Soup Server on a Raspberry Pi

I wanted my own private crawl server, locally on my network, just like CAO or BRO. There are only two sets of instructions on how to set up your own crawl server. Those first instructions look a bit insane; they seem to take the "chroot and yolo" approach to security.

The first step is to compile crawl, so we'll refer to the instructions here. We're on a Raspberry Pi 3 B+, so we'll be following the Debian instructions:

```
sudo apt install -y libsdl2-image-dev libsdl2-mixer-dev libsdl2-dev libncursesw5-dev python-pip
```

During testing I tried running crawl over my X-tunneling ssh session, and it was slow AF. Like 60% on all cores, never-made-it-past-the-splash-screen slow. Using the webserver is a much better option.

The server starts, and the lobby and game work. If your server logs show:

```
22:17:40,386 INFO: #2 P3 ERR: Unknown option: -webtiles-socket
22:17:40,387 WARN: #2 P3 Invalid JSON output from Crawl process: Command line options:
22:17:40,388 WARN: #2 P3 Invalid JSON output from Crawl process: -help prints this list of options
22:17:40,389 WARN: #2 P3 Invalid JSON output from Crawl process: -name character name
22:17:40,390 WARN: #2 P3 Invalid JSON output from Crawl process: -species preselect character species (by letter, abbreviation, or name)
```

that means you ran `make TILES=y` instead of `make WEBTILES=y`.

If you're like me and into the history of the game, you'll want to play some of the older versions. However, older versions of crawl were built with older versions of g++, and code that didn't produce warnings or errors back in 2010 does now. I decided on a whim that I would try to compile 0.6.1. I identified three locations that actually required changes:

```
:281:19   return nullptr
:2232:55  1.0
:2241:53  1.0
```

I ended up suppressing a few more warnings in the makefile (around line 486) to make it work:

```diff
- CFWARN_L += -Wno-parentheses -Wwrite-strings -Wshadow -pedantic -D_FORTIFY_SOURCE=0
+ CFWARN_L += -Wno-literal-suffix -Wno-deprecated-declarations -Wno-narrowing -Wno-parentheses -Wwrite-strings -Wshadow -pedantic -D_FORTIFY_SOURCE=0
```

With these, the game compiles and links successfully; however, I experienced a crash on D:1, and autoexplore didn't seem to work great. So I thought I would be able to run 0.6.1 as a webserver, but I was looking at the wrong branch: the webtiles version was added in 0.9.

Crawl 0.9.2 suffers from web rot. The contrib repos it relies on no longer exist, and I can't figure out how to get it to use SDL2 either.

## Hosting Multiple Versions from the Same Lobby

You can edit the config.py file to display multiple versions of crawl to play with. We're going to need to heavily edit, and automate the editing of, this games dictionary to display all our older/forked versions.

Running downloadmake-uni.sh with the name of the fork and the version of the fork to install will automatically put things in the right directory. For example, I want to install version 1.2 of X-Crawl:

```
./downloadmake-uni.sh x-crawl v1.2
```

My script knows the GitHub repo to pull X-Crawl from, and can download only the files we need to build the specified version. I've also hardcoded in the repos for Gnollcrawl, Yiufcrawl, Gooncrawl, and of course, the main crawl fork. The script also understands trunk, to download the main development branch. Certain checks along the way assure nothing breaks if the download or build fails.

For the purposes of understanding how I have my crawl webserver configured, my home directory looks something like this: ~

I have my main crawl server file in the trunk webserver directory, referencing all the other crawl binary files. I'm still working on my unified morgue directory - I don't like that different versions try to resume each other's games.

makeconfig-uni.sh appends to a file called gamesdict.txt, which allows you to easily copy the configuration for each version you make into the config.py file. It looks something like this:

```python
("dcss-web-1.5.5b1-yiuf", dict(
    crawl_binary = "/home/pi/build/crawl-server/1.5.5b1-yiuf/source/crawl",
    rcfile_path = "/home/pi/build/crawl-server/1.5.5b1-yiuf/source/rcs/",
    macro_path = "/home/pi/build/crawl-server/1.5.5b1-yiuf/source/rcs/",
    morgue_path = "/home/pi/build/crawl-server/morgue//1.5.5b1-yiuf",
    inprogress_path = "/home/pi/build/crawl-server/1.5.5b1-yiuf/source/rcs/running",
    ttyrec_path = "/home/pi/build/crawl-server/1.5.5b1-yiuf/source/rcs/ttyrecs/",
    socket_path = "/home/pi/build/crawl-server/1.5.5b1-yiuf/source/rcs",
    client_path = "/home/pi/build/crawl-server/1.5.5b1-yiuf/source/webserver/game_data/",
)),
```

One thing I have yet to fix is in the morgue path: %n should go between the two adjacent forward slashes.
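The games-dict entries the post describes follow a predictable pattern per version, so the generation step could be sketched in Python. This is a hypothetical reconstruction, not the author's actual script: `make_games_entry` and `BUILD_ROOT` are my own names, and it assumes the same directory layout as the 1.5.5b1-yiuf example, with `%n` placed between the two slashes of `morgue_path` as the post says it should be.

```python
# Sketch (assumed layout): build one webtiles games-dict entry per version.
BUILD_ROOT = "/home/pi/build/crawl-server"

def make_games_entry(version):
    """Return a (key, config) pair matching the gamesdict.txt pattern."""
    src = f"{BUILD_ROOT}/{version}/source"
    return (f"dcss-web-{version}", dict(
        crawl_binary    = f"{src}/crawl",
        rcfile_path     = f"{src}/rcs/",
        macro_path      = f"{src}/rcs/",
        # %n (the player name) between the slashes, per the fix the post mentions:
        morgue_path     = f"{BUILD_ROOT}/morgue/%n/{version}",
        inprogress_path = f"{src}/rcs/running",
        ttyrec_path     = f"{src}/rcs/ttyrecs/",
        socket_path     = f"{src}/rcs",
        client_path     = f"{src}/webserver/game_data/",
    ))
```

Generating entries this way keeps every version's paths consistent, so only the version string varies between builds.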
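The append-to-gamesdict.txt step that the post attributes to makeconfig-uni.sh could likewise be sketched as follows. `render_entry` and `append_to_gamesdict` are illustrative names I've made up; the real script is shell, and its exact output format is an assumption based on the example entry in the post.

```python
def render_entry(key, cfg):
    """Render one games-dict entry as config.py-style source text."""
    lines = [f'("{key}", dict(']
    for field, value in cfg.items():
        lines.append(f'    {field} = "{value}",')
    lines.append(")),")
    return "\n".join(lines)

def append_to_gamesdict(path, key, cfg):
    """Append a rendered entry to gamesdict.txt for later copy-paste into config.py."""
    with open(path, "a") as f:
        f.write(render_entry(key, cfg) + "\n")
```

Keeping the entries in a side file rather than editing config.py in place means a failed build never leaves the live server config half-written.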