
Thread: Reviews of restored numbers

  1. #61
    Join Date
    Mar 2010
    Posts
    649

    Default

    SAGE was the computer that produced 3D effects in Laserium shows beginning in 1987. It was a separate thing from the Choreographics system. An example of SAGE graphics is the yellow and green zigzag lines in my tribute to L.A. Woman video. Until recently, little was publicly known about the SAGE. Fortunately an individual or two (one of whom was one of the designers of the platform, and both of whom were Laserium show directors/choreographers) have generously communicated with me about it recently.

    This video shows the KQO with spiral ramp on the Z axis and the 3D computations I added yesterday. In this segment the emulator is in bit box mode, which means the 351 data is replaced by a static frame of 351 data that enables the desired system functions such as beams on and fixed rotation offsets.

    Video here:
    https://youtu.be/ekbCxZhv0ts

  2. #62
    Join Date
    Mar 2010
    Posts
    649

    Default

    One minor code change later and my mind is blown. Moving the whole 6b universe into a 3D universe gives immediate and amazing results.
    Attached Thumbnails: 3D_3.jpg


  3. #63
    Join Date
    Mar 2010
    Posts
    649

    Default

    Here's the video:
    https://youtu.be/vjuWaii3WmI

  4. #64
    Join Date
    Sep 2014
    Location
    Colorado USA
    Posts
    866

    Default

    Quote Originally Posted by Greg View Post
    Very nice Greg!
    ________________________________
    Everything depends on everything else

  5. #65
    Join Date
    Mar 2010
    Posts
    649

    Default

    lasermaster1977, I was bowled over by the photo of the quad mount you posted a bit ago. Elegance and efficiency... If a laser show were a weapon, it would look like that. Do I understand correctly that you built projection systems, including your Apple II work, and also performed live planetarium shows in the 1980s?

  6. #66
    Join Date
    Sep 2014
    Location
    Colorado USA
    Posts
    866

    Default

    Quote Originally Posted by Greg View Post
    lasermaster1977, I was bowled over by the photo of the quad mount you posted a bit ago. Elegance and efficiency... If a laser show were a weapon, it would look like that. Do I understand correctly that you built projection systems, including your Apple II work, and also performed live planetarium shows in the 1980s?
    Thanks for your kind words. And yes, guilty as charged. I started designing and building laser cloud and open-loop scanning projectors and analog consoles for short HeNe effects in regular astronomy-related planetarium shows from around late 1975 or early '76 thru '78. In the summer of '77 I attended a national planetarium conference in Boulder, CO and saw my first Laserium. That's when I knew with certainty it was mostly just simple and sophisticated scanned Lissajous images, stars and other background effects.

    That grew into my contracting out as Projected Imagery, Inc. to do full-fledged live RYGB laser shows at the same planetarium, beginning at the very end of Dec. 1978 thru '81, using more sophisticated consoles. The first Apple II DAC came along in early '79. That's about the time I started getting more and more corporate and other live performance work, which lasted thru the very end of 1988. I did occasional abstract and graphics programming for two local planetariums over the next 4-5 years, and after the millennium did some live laser accompaniment for stage bands at a Ft. Worth venue run by another former laserist and good friend, using one of my latest Apple IIe systems and the 1W mixed-gas ion lasers with G120Ds and AOMs I had sold to those planetariums. Those were my last true live performances.

    The quad mount you referenced was the result of my determination to mount 4 pairs of galvos as close together as physically practical, for a variety of reasons. Good times.
    ________________________________
    Everything depends on everything else

  7. #67
    Join Date
    Mar 2010
    Posts
    649

    Default

    Did you and your projection systems handle all choreography between the asymmetrical use of each of the four channels live? Did you have anything like cues on the computer that would occur in a pre planned or synchronized way with the music?

  8. #68
    Join Date
    Sep 2014
    Location
    Colorado USA
    Posts
    866

    Default

    Quote Originally Posted by Greg View Post
    Did you and your projection systems handle all choreography between the asymmetrical use of each of the four channels live? Did you have anything like cues on the computer that would occur in a pre planned or synchronized way with the music?
    Yes! Chris, my close friend, co-conspirator and EE, and I used an extremely accurate time reference card in the Apple IIe for cue timing and manual/auto mode cue execution. That means that when not "running" a canned script file, manual command entry and manual console interaction were possible. Manual card command mode, however, was not used in show presentations, but manual or partially manual console mode was.

    Once the four co-processor XYZ (Z = on/off blanking) display cards were uploaded with their "operational brains" (16 scale and shape routines, 8 waveform tables, and 13 initial abstract and/or line-art images loaded into 13 2K image buffers), the cards just sat like a good little doggie waiting for a command to execute. These cards could be sent display commands simultaneously or independently, and uploaded new scale and shape routines, waveform tables and/or new XYZ images on the fly...as fast as every .0625 seconds.
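    A toy model of that card architecture may make the shape of it clearer. This is a modern Python sketch, not the original firmware; the class and method names are invented, but the buffer counts and the 1/16-second command cadence come from the description above.

```python
# Toy model of one co-processor XYZ display card as described: 16 scale/shape
# routine slots, 8 waveform tables, 13 x 2K image buffers. The card idles
# until commanded; here "executing" a command just logs it.
class DisplayCard:
    FRAME_PERIOD = 0.0625  # commands could arrive as often as every 1/16 s

    def __init__(self, color):
        self.color = color                                   # "R", "Y", "G", "B"
        self.routines = [None] * 16                          # scale/shape routines
        self.waveforms = [None] * 8                          # waveform tables
        self.images = [bytearray(2048) for _ in range(13)]   # 2K image buffers
        self.log = []

    def execute(self, command):
        """Record a received command (stand-in for real card behavior)."""
        self.log.append(command)


def broadcast(cards, command):
    """Send one command to several cards simultaneously, as the post describes."""
    for card in cards:
        card.execute(command)
```

    One card per color channel; addressing them independently is just calling `execute` on a single card instead of broadcasting.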

    In place of the usual 80-column/memory expansion card, we used a 1.5MB memory expansion card that provided the 80-column monitor display and could also be used as 6 RAM drives, with more than 5 times the storage capacity of the typical 140K Apple DOS 3.3 floppy. (And NO, ProDOS was not a good fit for this.) Fetching show content preloaded onto the RAM drives and uploading it to the co-processor cards was many, many times faster than getting those files from the normal floppy drives.

    When preparing to program a show or event, I had already spent many hours listening to each sound event or piece of music on the soundtrack, and I had a good feel for when I wanted "something" to happen as the soundtrack played.

    To derive a close time-based cue script containing all the raw show-clock-timed events prior to programming show content, a utility program was loaded and running, waiting for a key press. I mainly used custom CDs I burned with the soundtrack. I'd hit the CD start button while at the same time hitting the designated Apple IIe key to start the program clock, and time 00:00.0 was written to an ASCII buffer. At the first downbeat heard on the soundtrack, another key press caused the "current" program clock time to be saved to the ASCII text buffer. There was a key press for each additional show event until all the time cues had been defined, and the text file would then be saved to disk. These resulting text files could be EXEC'd into an assembler text editor for editing and refining. I always used my Sony CD player from that time; from when the play button was pressed to when the soundtrack actually started was always consistent and accurate as all get-out.
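    That tap-to-timestamp capture loop translates almost directly to any modern environment. A minimal Python sketch under the same scheme (the original ran on the Apple IIe; all names here are invented for illustration, and the MM:SS.t cue format is assumed from the "00:00.0" example above):

```python
import time


def format_cue(elapsed_seconds):
    """Render an elapsed time in seconds as an MM:SS.t cue string."""
    minutes, seconds = divmod(elapsed_seconds, 60)
    return f"{int(minutes):02d}:{seconds:04.1f}"


def capture_cues():
    """Tap Enter at each show event; 'q' then Enter finishes the list.

    Stand-in for the Apple IIe utility described above: time 00:00.0 is
    written first, then one cue line per key press.
    """
    input("Press Enter to start the program clock (and the CD player)...")
    start = time.monotonic()
    cues = ["00:00.0"]
    while True:
        key = input()  # one press per show event
        if key.strip().lower() == "q":
            break
        cues.append(format_cue(time.monotonic() - start))
    return cues
```

    The resulting list of bare times is the equivalent of the BEFORE file: raw cue times, ready to have commands appended during editing.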

    I'd wind up with a text file something like the BEFORE shown below. Card and/or console commands and parameters would then be appended to each time line, as shown in AFTER:

    Attached image: LasermasterBTime-2.jpg
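    The attachment is an image, so the exact syntax isn't recoverable here, but from the description a BEFORE file would be bare cue times and an AFTER file the same times with card/console commands appended. A purely hypothetical illustration (all command mnemonics invented):

```
BEFORE            AFTER
00:00.0           00:00.0  C1 DISPLAY IMG 03
00:04.2           00:04.2  C1,C2 WAVE 05 SCALE +7
00:12.8           00:12.8  CON TTL 17 ON
```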

    Show script text files were commonly created one section of soundtrack at a time, then appended together.

    The Apple IIe's primary job during scripted playback was to read the time cue script, load the first "cue time" into a temporary buffer, and compare that time to the time reference card's time relative to show start. When they agreed, the Apple would send the associated scripted command(s) to the co-processor card(s). The co-processor XYZ display cards would then execute the commands and magic would come out of the DACs. There were 4 DACs per card (X, Y, and +-X scale, +-Y scale or modulation) plus one TTL blanking line; one card per color, RYGB or RGB. In the 3-card (RGB) case, the fourth card was free to be used as a modulation or special-use source.
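    That compare-and-fire playback loop can be sketched in a few lines. This is a modern Python illustration, not the original 6502 code; `read_show_clock` and `send_to_cards` are invented names standing in for the time reference card and the co-processor interface:

```python
def run_script(cues, read_show_clock, send_to_cards):
    """Fire each (cue_time, commands) pair when the show clock reaches it.

    `cues` must be sorted by time; `read_show_clock` returns seconds since
    show start; `send_to_cards` delivers the command string to the card(s).
    Mirrors the described loop: hold the next cue time in a buffer, poll the
    clock, and dispatch when the two agree.
    """
    fired = []
    i = 0
    while i < len(cues):
        cue_time, commands = cues[i]
        if read_show_clock() >= cue_time:  # clock has reached (or passed) the cue
            send_to_cards(commands)
            fired.append((cue_time, commands))
            i += 1
    return fired
```

    Polling against a dedicated time reference card rather than counting CPU cycles is what kept the cue timing accurate regardless of how long each command upload took.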

    The Apple IIe's secondary job was to execute all scripted laser console TTL control lines: interlocks, beam stops, motor or relay on/offs, and TTL-controlled analog two-bus, four-channel XYZ mixer switching via a 1-of-56 TTL decoder. There were 56 control lines in total; the first 24 drove devices with +13.5VDC @ 250mA, and the remaining 32 were for analog switching/DAC switching.

    Although it was possible to create a 100% automated show...where is the fun in that? My entire vision was based on enhancing a human interactive show performance as much as possible, while making sure the most important things were never left to chance. So the console had manual overrides for human control actions, a manually assignable 4-quadrant RYGB joystick, and additional manually assignable AM modulators to modulate the co-processor XYZ DAC outputs.

    The console included a high speed XYZ, 12 in, 3 out oscilloscope multiplexer that displayed all co-processor card outputs.

    Of course, some details I'm leaving out for brevity...yeah, right, like this was brief.
    ________________________________
    Everything depends on everything else

  9. #69
    Join Date
    Mar 2010
    Posts
    649

    Default

    I would hazard that a design pattern exists for the one co-processor with uploaded operational brains per channel solution. Given the half century difference in hardware choices, the 6b emulator is basically what you just described. The button pushing to create a list of cues and then adding commands and parameters doesn't change over time. Nor does the button push that creates the sync and after that everyone flies off their gyro. That gives me pause to re-appreciate how weirdly evolutionary the 6b was, as it in no way used that pattern. A buzzing little city of attractions accessible on busses, and the inseparability of mag track for the sync. Not the same thing at all.

    Gets me wondering what pattern implementing a 6b emulator on four radiators would be.

  10. #70
    Join Date
    Sep 2014
    Location
    Colorado USA
    Posts
    866

    Default

    Quote Originally Posted by Greg View Post
    I would hazard that a design pattern exists for the one co-processor with uploaded operational brains per channel solution. Given the half century difference in hardware choices, the 6b emulator is basically what you just described. The button pushing to create a list of cues and then adding commands and parameters doesn't change over time. Nor does the button push that creates the sync and after that everyone flies off their gyro. That gives me pause to re-appreciate how weirdly evolutionary the 6b was, as it in no way used that pattern. A buzzing little city of attractions accessible on busses, and the inseparability of mag track for the sync. Not the same thing at all.

    Gets me wondering what pattern implementing a 6b emulator on four radiators would be.
    I'm not sure I follow what you mean about "a design pattern exists...", except you are also saying that the 6b is similar. My apologies for not having a clear understanding of the 6b or what you mean by "the half century difference in hardware choices". Do you mean "back then" compared to what you have at your disposal today?

    But, wondering about "possibilities" is always a good thing!
    ________________________________
    Everything depends on everything else
