Posts by user "CliveL" [Posts: 162 Total up-votes: 0 Page: 7 of 9]

CliveL
September 01, 2012, 17:36:00 GMT
permalink
Post: 7390838
I entirely agree with John Farley's comments. We were planning to fit canards on the second generation SST for exactly those reasons - they also gave a slightly better L/D in take-off climb which was useful for noise abatement, but they only just earned their keep in terms of economics!

Subjects (links are to this post in the relevant subject page so that this post can be seen in context): Noise Abatement

CliveL
September 02, 2012, 09:56:00 GMT
permalink
Post: 7391786
@stilton

I can't give you much detail I'm afraid - as JT said a few posts ago:

No point asking Clive .. he's an aerodynamicist and, hence, only talks in slugs/cubic foot.
Basically the differences lie in the fine details of the structure. To quote from a Googled article:
Safe-life refers to the philosophy that the component or system is designed to not fail within a certain, defined period. It is assumed that testing and analysis can provide an adequate estimate for the expected lifetime of the component or system. At the end of this expected life, the part is removed from service.
whereas:
Fail-safe designs are designs that incorporate various techniques to mitigate losses due to system or component failures. The design assumption is that failure will eventually occur but when it does the device, system or process will fail in a safe manner.
On the UK parts there were detailed features such as crack-stoppers and multiple load paths, whereas the French design relied on analysis and testing to establish where and when any failures might be expected to occur. The consequence was that the ultimate life of the airframe was dictated by the number of thermal fatigue cycles accumulated in the Farnborough major fatigue facility, divided by the factor of safety demanded by the airworthiness authorities - a factor which was conservative because one was really into unknown territory.
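The safe-life arithmetic described above can be sketched as follows. All numbers here are hypothetical illustrations, not Concorde data:

```python
# Sketch of the safe-life arithmetic: certified airframe life is the life
# demonstrated in the fatigue test rig divided by the safety (scatter)
# factor demanded by the airworthiness authority.
# All numbers below are hypothetical, NOT Concorde figures.

def certified_life(demonstrated_test_cycles: float, safety_factor: float) -> float:
    """Certified life = cycles demonstrated on the rig / required factor."""
    return demonstrated_test_cycles / safety_factor

# e.g. 21,000 thermal-fatigue cycles demonstrated on a test rig, with a
# conservative factor of 3 applied because of the unknown territory:
print(certified_life(21_000, 3.0))  # 7000.0 certified cycles
```

The larger the factor demanded, the more rig cycles must be accumulated to certify a given service life, which is why the conservative factor directly limited the ultimate life of the fleet.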

Subjects: Fatigue

CliveL
September 02, 2012, 10:58:00 GMT
permalink
Post: 7391877
.. I really shouldn't be as cheeky as I tend to be at times ...
John, you should know that engineers live their lives in an atmosphere of mutual jovial insult.

Last edited by CliveL; 2nd September 2012 at 10:58 .

Subjects: None

CliveL
September 02, 2012, 11:02:00 GMT
permalink
Post: 7391881
Just for the sake of argument, if the two fleets were still operating, surely BA's would be approved for a longer life with the fail-safe method of construction?
No, the aircraft in each fleet were structurally identical. Every aircraft was built with front and rear fuselages made in the UK and central fuselages and wings made in France. Only the final assemblies were specific to each country.

Subjects: None

CliveL
September 02, 2012, 11:06:00 GMT
permalink
Post: 7391887
Don't know much about radio altimeters. First thoughts were that it was some sort of vortex generator, but it seems a funny place to have one. Second thoughts were that the 'vane' standing off the surface was there to generate some sort of suction (the 'hole' seems to be on the leeward side) to make sure that the inside did drain in all conditions.

Subjects: Vortex

CliveL
October 27, 2012, 08:40:00 GMT
permalink
Post: 7488997
How about this:



In normal operation (centre picture), the flow in the upper half of the intake was supersonic with a normal shock as required to decelerate to subsonic conditions. In the lower half the flow was decelerated to just sonic by the cowl shock. If the engine demand increased the region of supersonic flow got bigger until it nearly filled the intake (right hand picture).

The small reversed "D" zone just below the bleed slot is the supersonic region. The bleed flow entered the bleed aft of the normal shock.

Last edited by CliveL; 27th October 2012 at 08:44 .

Subjects: Bleed Air

CliveL
October 28, 2012, 07:16:00 GMT
permalink
Post: 7490288
@ peter kent

As you say, a complex subject!

Maybe the missing link is that a plane shock is not the only way to decelerate through Mach 1.0. If the nose of a body is blunt, or if the angle you are trying to turn the flow through is too big, then the shock wave becomes detached from the leading edge of the body. The bit of the shock on the 'cusp' is then actually a very strong plane (normal) shock and the flow immediately behind that part is subsonic. In the case of a sharp surface with a large turning angle this subsonic flow allows air to escape from the high pressure side of the surface to the low pressure side. This would be the case for example if the flow onto the leading edge of an intake hit it at too big an angle.

Supersonic intakes come in two basic guises - external compression and internal compression. The ramjet intakes you have been reading about are the latter type in which all the deceleration/compression takes place inside the intake. In these designs the final compression is through a normal shock situated at the minimum area 'throat' of the intake where the flow is close to Mach 1.0. This flow is delicately balanced and if some engine disturbance causes the shock to move into the converging supersonic bit of the intake the whole shock system can be expelled giving all sorts of problems (inlet unstart). Generally they are used for high Mach numbers where their higher theoretical efficiency and low external/spillage drag count for more than the additional control system complexity and performance requirements.

In external compression intakes (a simple pitot intake would be an extreme example), all the compression is done by a system of shock waves that sit outside the intake. These intakes are less efficient than internal compression intakes and they also spill a lot of air which produces external drag. Usually restricted to low supersonic Mach numbers.
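The efficiency penalty that drives the choice between these intake types can be seen from the standard normal-shock relations for a perfect gas - a textbook sketch, not from the original post:

```python
# Normal-shock relations (perfect gas, gamma = 1.4), illustrating why a
# single normal shock is an inefficient way to decelerate at higher Mach
# numbers: the total-pressure recovery falls rapidly with upstream Mach M1.
g = 1.4

def downstream_mach(m1: float) -> float:
    """Mach number immediately behind a normal shock (always subsonic)."""
    return ((1 + 0.5 * (g - 1) * m1**2) / (g * m1**2 - 0.5 * (g - 1))) ** 0.5

def recovery(m1: float) -> float:
    """Total-pressure ratio p02/p01 across a normal shock."""
    a = ((g + 1) * m1**2) / (2 + (g - 1) * m1**2)
    b = (g + 1) / (2 * g * m1**2 - (g - 1))
    return a ** (g / (g - 1)) * b ** (1 / (g - 1))

print(downstream_mach(2.0))  # ~0.58: flow is well subsonic behind the shock
print(recovery(2.0))         # ~0.72: over a quarter of the total pressure lost
print(recovery(1.3))         # ~0.98: a weak final shock costs very little
```

This is why multi-shock (or mixed) compression is used at Mach 2: a series of weak oblique shocks plus a weak final normal shock recovers far more total pressure than one strong normal shock would.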

Concorde's intake was a "mixed compression" design which had some features of each type. At low engine mass flow demands the flow coming on to the cowl lip could be at too great an angle to maintain attached shock waves so it behaved a bit like that described earlier. You can see this most clearly in the left hand picture where the lower efficiency and higher spillage can be seen in the graph of efficiency against intake capture (epsilon). In this state the intake behaved more like an external compression type and there was no appreciable final normal shock.

At high engine demand the angle of flow hitting the cowl was such that the shock waves remained attached and the intake functioned more like an internal compression design. Again you can see this in the right hand picture which shows most of the intake throat covered by a normal shock and in the graph where total intake flow (engine plus bleed) is constant.

On condition there was a bit of each, but since it was designed to minimise spillage you cannot see the detachment of the cowl lip shock at the scale of the diagram.

Hope this is helpful rather than additionally confusing!

PS: Looking at the centre picture again, it occurs to me that the curved shock running from the lip back and up to the reversed "D" would actually be normal to the approaching local flow which was being turned by the ramps and the isentropic compression. This would be the shock you are looking for to decelerate the flow to subsonic conditions. In other words the intake was functioning as an external compression design over this part.

Last edited by CliveL; 28th October 2012 at 07:28 .

Subjects: Bleed Air  Intakes

CliveL
February 23, 2013, 08:06:00 GMT
permalink
Post: 7710332
As a Chartered Engineer working on a multi-disciplinary rail project, I am amazed that a project as complex as this was managed across the Channel in the 1960's; how was the Systems Engineering managed - who drove the requirements for the Jet, potential carriers, engineers or politicians?
For sure not the politicians, who could not distinguish between their rectum and middle arm joint so far as aircraft systems were concerned.
Once it was decided to go, I would say that the system requirements were largely driven by the difficulty of the task - more a question of finding out how to make it work than of optimising. The overall aircraft requirements were driven by the engineers, but criticised by the potential customer airlines in regular meetings.

Safety requirements were specified in a completely new airworthiness code - a sort of comprehensive set of special conditions, which were generally more severe than the subsonic codes of the time. Concorde, for example, was, AFAIK, the first civil aircraft to be certificated against the requirements that now exist as 25.1309.
But nobody really knew what to write for supersonic flight and, in particular, the transition from subsonic, so to some extent one made it up as one went along, using prudent common sense and engineering judgement. Fuel system transfer rates for example had to match a requirement that it should be possible to abandon the acceleration at any point and return safely to subsonic conditions - and the deceleration was much quicker than the acceleration!
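The probability approach behind 25.1309 can be sketched as follows. The failure rates here are hypothetical illustrations, not Concorde figures; the now-standard target for catastrophic failure conditions is on the order of 1e-9 per flight hour:

```python
# Sketch of the probability approach to system certification (25.1309 style).
# All failure rates below are hypothetical illustrations, NOT Concorde data.

# Two independent, redundant channels, each with a failure rate lam per
# flight hour. For small lam*t, the probability that BOTH have failed
# within one flight of duration t is approximately (lam * t) ** 2.
lam = 1.0e-4   # per-channel failure rate, per flight hour (hypothetical)
t = 3.0        # exposure time: a 3-hour flight

p_dual_failure = (lam * t) ** 2   # probability per flight, both channels lost
per_hour = p_dual_failure / t     # averaged probability per flight hour

print(p_dual_failure)  # roughly 9e-8 per flight
print(per_hour)        # roughly 3e-8 per flight hour
```

The point of the approach is that redundancy multiplies small probabilities together, so individually modest components can meet an extremely severe fleet-level target.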

Wind tunnel modelling played a big part in the development of the aerodynamics; how big did the models get? Did you have the luxury of testing a full-scale model? Or maybe full-scale parts or sub-assemblies?
Well, some people might say we had four full scale models - two prototypes and two pre-production models! Supersonic testing was mainly at 1/30 scale; low speed at 1/18. The biggest model was a 1/6 scale half model used mainly for icing tests. Isolated intake tests were, IIRC, at about 1/10 scale, but we did have a full scale intake operating in front of an Olympus 593 at Mach 2.0 in Cell 4 at NGTE Pyestock.

Subjects: Olympus 593

CliveL
October 14, 2013, 14:06:00 GMT
permalink
Post: 8098479
The ORIGINAL design for the reheat was done by SNECMA, but due to them getting into all sorts of trouble with the fuel injection system and flame stabilisation, Rolls-Royce bailed them out, and it became a Rolls-Royce/SNECMA design.
ref heritageconcorde.com

Does anyone have any details on the 'joint' development alluded to above?


Attended a Powerplant Design Group reunion earlier, so I thought I would try to get an answer from somebody who really knows ....

The problem apparently was that flame stabilisation operating in "contingency" rating was sensitive to the point that every engine had to be checked, so there was a lot of engine plus reheat testing, most of which was done at Patchway. The solution was addition of some form of 'spike' at various points on the spray bar (my informant wasn't very specific). It sounded like a sort of vortex generator cum chine that gave the flame somewhere to latch onto. The development process was, as you suggested, a joint activity.

Subjects: Afterburner/Re-heat  Rolls Royce  Vortex

CliveL
October 17, 2013, 10:15:00 GMT
permalink
Post: 8103250
Peter

I've just started reading it, and it's pure Ted (and Ann)

Do buy it - it is probably the most amusing (and human) book on Concorde you will ever read. Best seven quid's worth I have spent for a long time.

Update: I have now finished it - I couldn't put it down. Definitely autobiographical, but worth buying for the Concorde bits alone. Maybe I'm biased as I share many of his memories. Perhaps UK readers will appreciate the non-Concorde bits more.

Last edited by CliveL; 17th October 2013 at 18:46 . Reason: update

Subjects: None

CliveL
October 18, 2013, 18:12:00 GMT
permalink
Post: 8106029
Dozy

Even when Concorde entered production, the most complex digital displays available to aviation were of the 7-segment LED type (as used in the Apollo Guidance Computer), and they were both wildly expensive and of limited use.
Yeah, well, when we put in a digital computer to generate the AICS laws, that was NEW, man!



Ergonomically speaking, both engineers and pilots of the era write of Concorde's flight deck being the best possible balance of form and function available at the time - sure it looks cluttered to the modern eye,
Again, no digital multifunction displays on offer in those days

It's worth bearing in mind that even those not particularly well-disposed to Airbus will grudgingly admit that the flight deck ergonomics on those types are extremely good - and a lot of the lessons learned were from cramming all that information into Concorde's limited space.
Errrr no, I don't think so. Concorde's flight deck was done at Filton and we had no involvement in the Airbus designs in that area.


I have to thank EXWOK for explaining the windows - but I'll add the more prosaic reason that you don't need a particularly large window to see the curvature of the Earth in all its splendour - which is for the most part all you'd be seeing during the flight!
Exwok's remark is not quite right IIRC. Certainly the window size was dictated by pressurisation failure, but one couldn't maintain cabin pressure with two windows failed - the design case was to get to a breathable altitude before you killed too many passengers! Also, there is very little to see when you have a delta wing under you.

While Concorde herself never recouped the development money granted by the governments of the UK and France, the infrastructure and R&D her development put in place paved the way for the Airbus project
Ummm - most participants reckoned that the Concorde infrastructure showed the way not to do it, and besides the early Airbuses were developed in parallel with the later stages of Concorde development. You have a point where R&D is concerned though - several technologies developed for Concorde found their way onto the subsonic fleet, not the least being the probability approach to system certification.

Subjects: AICS (Air Intake Control System)  Airbus  Filton  Pressurisation

CliveL
October 23, 2013, 12:49:00 GMT
permalink
Post: 8113333
I know Concorde engines were FADEC. Were the thrust levers gated like on Airbus? I noticed Concorde pilots shoved the levers forward for take-off thrust .. not the gentle easing forward like most other turbojets .. why was there this need? Did they take too much time to spool up?
The engine was electrically signalled, but it wasn't FADEC; the control system(s) were analogue.
I suspect the zero bypass Ol 593 would take less time to spool up than today's high bypass engines.

Subjects: Airbus  Olympus 593

CliveL
January 08, 2014, 06:53:00 GMT
permalink
Post: 8252886
msbbarratt

I often wonder though; given that all flight dynamics on all aircraft types can (presumably) be expressed by systems of differential equations, are we missing a trick? Implement the equations in analogue but have a digital wrapper around it to provide the modern supervisory functions? If it could be done it would save weight, power, cost; an analogue circuit could be made really, really small these days.
That of course is essentially what was done on the intake control system. The basic analogue "inner loop" was retained to do the actuation but it operated to non-linear laws and limits defined by a digital system.
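That split can be sketched in miniature as follows - a minimal sketch in which all laws, gains, and numbers are invented for illustration, not taken from the AICS:

```python
# Sketch of a digital-supervisor / analogue-inner-loop architecture, as
# described above for the intake control system. Everything here (the
# scheduled law, the limits, the gains) is invented for illustration.

def digital_supervisor(mach: float) -> tuple[float, float, float]:
    """Slow digital outer loop: schedules a non-linear demand law and the
    authority limits as a function of flight condition (here just Mach)."""
    demand = 0.0 if mach < 1.3 else 10.0 * (mach - 1.3) ** 1.5  # non-linear law
    lo, hi = 0.0, 25.0                                          # actuator limits
    return demand, lo, hi

def inner_loop(position: float, demand: float, lo: float, hi: float,
               gain: float = 0.5) -> float:
    """Fast 'analogue' inner loop: simple proportional actuation toward the
    demanded position, clipped to the limits set by the supervisor."""
    position += gain * (demand - position)
    return min(max(position, lo), hi)

# The inner loop runs many cycles for each supervisor update:
pos = 0.0
for mach in (1.0, 1.5, 2.0):
    demand, lo, hi = digital_supervisor(mach)
    for _ in range(50):
        pos = inner_loop(pos, demand, lo, hi)
    print(f"M{mach}: ramp position -> {pos:.2f}")
```

The design point is exactly the one in the post: the fast, simple actuation loop needs no digital hardware, while the non-linear laws and limits - the part that is awkward in analogue electronics - live in the slower digital layer.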

Subjects: None

CliveL
February 20, 2014, 07:09:00 GMT
permalink
Post: 8328956
Static ports

SSD

I'm afraid I can't tell you what they actually do, but I am pretty sure they aren't part of the anemometry because those static ports are "pepperpots" mounted on specially machined and jigged flat plates. This was necessary because static pressure at Mach 2 is sensitive to local skin waviness.

Do you have a photo?

Subjects: None

CliveL
February 22, 2014, 12:04:00 GMT
permalink
Post: 8333372
SSD

I know they aren't anything to do with AICUs but seeing where they are located and looking at Bellerephon's diagram I would think they are reference static ports for the air conditioning system - needed to monitor differential pressure.

Dude, where are you when we need you?

Subjects: None

CliveL
March 04, 2014, 16:51:00 GMT
permalink
Post: 8352423
Sorry Nick, you are out of luck on that one. It was tailor-made to optimise cruise drag. It varied from 3% thick at the root to 1.8% near the tip, but the camber and twist don't fit any recognisable standard section.

PM me and I will send you something that might help

Subjects: None

CliveL
June 29, 2014, 15:12:00 GMT
permalink
Post: 8542535
On p86 he says "It followed the idea of multi-vane auxiliary air inlets into history."

Anyone know the story on these inlets?

They were an attempt to avoid the mechanical complexities of the prototype double-hinged 'barn door' combined dump door/auxiliary intake by having several 'blow-in' vanes set in the door, which were locked when the door was operated as a dump door.
They had their own set of mechanical problems, and the idea was abandoned in favour of a single blow-in door (the production solution).

Subjects: None

CliveL
April 05, 2015, 07:55:00 GMT
permalink
Post: 8933615
@EXWOK


There was a certification requirement for the descent time from FL600 down to FL100, if I recall correctly. Can't remember the value though. In-flight reverse was developed to trim some fraction of a minute off the time to get inside the requirement.


@ a_q

Not sure what you mean by a "leaky" intake. At about 2.2M the first shock would hit the intake lower lip and from that point on the total intake mass flow was frozen. Increased engine mass flow could only be obtained by reducing bleed flow and that gave higher engine face flow distortions driving the engine towards surge and lower intake recovery. So engine mass flow was effectively fixed also.
Then the amount of "dry" fuel which could be added was limited because the higher Mach number increased the engine entry temperature but the maximum turbine entry temperature was fixed.
You could add thrust by using reheat, but you would not get as much as you would like because the final nozzle, being designed for 2.0M would be too small for optimum efficiency at higher Mach numbers.
Overall, IIRC we got to 2.23M in flight test. If you pushed me I would say it might be possible with reheat etc to get to 2.25 or 2.26M, but it would be a blind guess!
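The engine-entry temperature point can be illustrated with the standard adiabatic relation Tt = Ts(1 + 0.2 M²) - a sketch assuming ideal intake recovery and ISA stratosphere static temperature, not flight data:

```python
# Why dry thrust runs out with Mach number: the total (stagnation)
# temperature at the engine face rises with M^2 while the maximum turbine
# entry temperature is fixed, squeezing the allowable temperature rise from
# fuel. Standard relation for gamma = 1.4, ideal recovery assumed.
TS = 216.65  # K, ISA static temperature above the tropopause

def total_temperature(mach: float) -> float:
    """Stagnation temperature at the engine face for ideal intake recovery."""
    return TS * (1 + 0.2 * mach**2)

for m in (2.0, 2.23):
    print(f"M{m}: Tt = {total_temperature(m):.0f} K")
# Roughly 390 K at M 2.0 rising towards ~432 K at M 2.23: every extra
# degree at entry is a degree less available before the turbine limit.
```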

Subjects: Afterburner/Re-heat  Bleed Air  Engine surge  Nozzles

CliveL
April 08, 2015, 07:02:00 GMT
permalink
Post: 8936540
Leaky

@a-q


Ah yes, page 55 from 4 years ago ...... It's my age you know!


What threw me was your reference to a leaky intake - on 101 it was all the nacelle aft of the intake that leaked, not the intake itself.

Subjects: None

CliveL
April 09, 2015, 07:24:00 GMT
permalink
Post: 8937544
@stilton


Yes they did. I tried to post a photograph but the Dropbox link doesn't seem to work any more (neither does the "quote" icon)

Subjects: None